feat(backend): adding s3 supports, and also expose uploading project images #154
Conversation
Walkthrough

This pull request updates configuration, dependency management, and file upload logic across both backend and frontend projects.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant Resolver as "ProjectsResolver"
    participant Service as "ProjectService"
    participant Uploader as "UploadService"
    participant Storage as "S3/Local Storage"
    Client->>Resolver: updateProjectPhoto(input: { projectId, file })
    Resolver->>Service: updateProjectPhotoUrl(userId, projectId, fileBuffer, mimeType)
    Service->>Uploader: upload(fileBuffer, mimeType)
    Uploader->>Storage: Upload file
    Storage-->>Uploader: Return uploadResult (URL)
    Uploader-->>Service: Return uploadResult
    Service-->>Resolver: Return updated project with new photo URL
    Resolver-->>Client: Respond with updated project
```
Warning: There were issues while running some tools. Please review the errors and either fix the tool's configuration or disable the tool if it's a critical failure.

🔧 ESLint
backend/src/project/project.resolver.ts

Oops! Something went wrong! :( ESLint: 8.57.1. ESLint couldn't find the plugin "eslint-plugin-prettier". (The package "eslint-plugin-prettier" was not found when loaded as a Node module from the directory "/backend".) It's likely that the plugin isn't installed correctly; try reinstalling it. The plugin "eslint-plugin-prettier" was referenced from the config file in "backend/.eslintrc.js". If you still can't figure out the problem, please stop by https://eslint.org/chat/help to chat with the team.

backend/src/project/project.service.ts

The same ESLint error was reported for this file: the plugin "eslint-plugin-prettier" could not be found from the config in "backend/.eslintrc.js".

✨ Finishing Touches
Actionable comments posted: 5
🧹 Nitpick comments (17)
backend/src/main.ts (1)
27-30: Consider adding rate limiting for file uploads.

The current configuration allows up to 10 files per request but doesn't include rate limiting, which could lead to abuse or DoS attacks targeting your upload endpoint. Consider adding a rate limiter middleware for the GraphQL endpoint, especially for file upload operations:

```ts
import { ThrottlerGuard, ThrottlerModule } from '@nestjs/throttler';
import { APP_GUARD } from '@nestjs/core';

// In your AppModule
imports: [
  ThrottlerModule.forRoot([{
    ttl: 60000, // 1 minute
    limit: 10, // 10 requests per minute for file uploads
  }]),
],
providers: [
  {
    provide: APP_GUARD,
    useClass: ThrottlerGuard,
  },
]
```

backend/src/config/config.module.ts (2)
8-22: Validation function handles environment variables correctly, but consider improving error formatting.

The validation function correctly transforms and validates the configuration object. However, the error message when validation fails could be improved for better debuggability. Consider enhancing the error message formatting to provide more detailed information:

```diff
- if (errors.length > 0) {
-   throw new Error(errors.toString());
- }
+ if (errors.length > 0) {
+   const formattedErrors = errors.map(error => {
+     const constraints = error.constraints ? Object.values(error.constraints) : [];
+     return `${error.property}: ${constraints.join(', ')}`;
+   }).join('\n');
+   throw new Error(`Environment validation failed:\n${formattedErrors}`);
+ }
```
24-34: Module is properly configured, but should document the configuration source.

The AppConfigModule is correctly set up with global scope and validates environment variables, but it doesn't explicitly specify where to load the variables from. Consider adding a comment or documentation that explains the configuration source:

```diff
 @Module({
   imports: [
     NestConfigModule.forRoot({
       validate,
       isGlobal: true,
+      // Loads environment variables from .env files and process.env
     }),
   ],
   providers: [AppConfigService],
   exports: [AppConfigService],
 })
 export class AppConfigModule {}
```
backend/src/config/env.validation.ts (2)
24-30: Improve security for S3 credentials.

S3 credentials are sensitive information and should be properly secured. Consider using environment variable injection at runtime or a secure vault service for production environments rather than storing these values in .env files. For local development, ensure .env files are properly excluded from version control (check .gitignore).

```bash
#!/bin/bash
# Verify .env files are properly excluded in .gitignore
echo "Checking .gitignore for proper exclusion of .env files..."
grep -E "\.env|env\." .gitignore
```
36-50: Enhance validation for URL-based environment variables.

The S3_PUBLIC_URL and S3_ENDPOINT could benefit from more specific validation rules. Consider using the IsUrl decorator for URL validation:

```diff
 @IsOptional()
 @IsString()
+@IsUrl({ require_tld: false })
 S3_ENDPOINT?: string;

 @IsOptional()
 @IsString()
 S3_ACCOUNT_ID?: string;

 @IsOptional()
 @IsString()
+@IsUrl({ require_tld: false })
 S3_PUBLIC_URL?: string;
```

Don't forget to import IsUrl:

```diff
-import { IsOptional, IsString, IsNumber, IsIn } from 'class-validator';
+import { IsOptional, IsString, IsNumber, IsIn, IsUrl } from 'class-validator';
```

backend/src/project/dto/project.input.ts (2)
123-132: File upload input type implemented properly.

The `UpdateProjectPhotoInput` class is well-designed for handling file uploads:

- It correctly uses GraphQL and validation decorators
- It makes the file field optional with `@IsOptional()`
- It properly types the file as `FileUpload` from the graphql-upload package

One additional validation check to consider would be file size or type validation, especially for images. Consider adding file type and size validation to prevent large or invalid file uploads:

```diff
 @IsOptional()
+@ValidateIf(o => o.file !== undefined)
+@IsFile({ mime: ['image/jpeg', 'image/png', 'image/gif', 'image/webp'] })
+@MaxFileSize(5 * 1024 * 1024) // 5MB limit
 @Field(() => GraphQLUpload)
 file: FileUpload;
```

This would require adding additional validation decorators from a package like `nestjs-form-data` or creating custom validators.
81-88: Consider deprecating the URL-based input type.

Since you're introducing a new file upload input type `UpdateProjectPhotoInput`, consider deprecating the `UpdateProjectPhotoUrlInput` class if it will eventually be replaced. This will help guide developers to use the new approach.

```diff
 @InputType()
+/**
+ * @deprecated Use UpdateProjectPhotoInput with file upload instead
+ */
 export class UpdateProjectPhotoUrlInput {
   @Field(() => ID)
   projectId: string;

   @Field()
   photoUrl: string;
 }
```

backend/src/config/config.service.ts (1)
47-57: S3 configuration is thorough but lacks validation.

The S3 configuration object covers all necessary parameters for both standard S3 and Cloudflare R2. However, it doesn't validate or provide default values for these parameters. Consider adding validation or default values for non-critical S3 parameters:

```diff
 get s3Config() {
   return {
     accessKeyId: this.configService.get('S3_ACCESS_KEY_ID'),
     secretAccessKey: this.configService.get('S3_SECRET_ACCESS_KEY'),
     region: this.configService.get('S3_REGION'),
     bucketName: this.configService.get('S3_BUCKET_NAME'),
     endpoint: this.configService.get('S3_ENDPOINT'),
     accountId: this.configService.get('S3_ACCOUNT_ID'),
-    publicUrl: this.configService.get('S3_PUBLIC_URL'),
+    publicUrl: this.configService.get('S3_PUBLIC_URL') ||
+      `https://${this.configService.get('S3_BUCKET_NAME')}.s3.${this.configService.get('S3_REGION')}.amazonaws.com`,
   };
 }
```

frontend/src/lib/client.ts (1)
18-25: Upload link configuration looks good, but consider additional options.

The upload link is correctly configured to replace the standard HTTP link. The `Apollo-Require-Preflight: true` header is important for Apollo to handle file uploads properly. Consider adding additional upload options for better error handling and user experience:

```diff
 const uploadLink = createUploadLink({
   uri: process.env.NEXT_PUBLIC_GRAPHQL_URL || 'http://localhost:8080/graphql',
   headers: {
     'Apollo-Require-Preflight': 'true',
     'Access-Control-Allow-Credentials': 'true',
     'Access-Control-Allow-Origin': '*',
   },
+  // These options can improve the upload experience
+  includeExtensions: true,
+  fetchOptions: {
+    credentials: 'include',
+  },
 });
```

backend/src/upload/upload.service.ts (4)
21-40: Consider handling partial or invalid S3 configurations more robustly.

While `hasS3Configured` likely checks for completeness, there may be cases where only a subset of credentials or an invalid region is supplied. Consider validating all required fields and throwing an explicit error if the configuration is partially invalid. This ensures early failure and clearer error messages.
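As an illustration of that early-failure idea, a small validator could report exactly which required fields are missing. This is a sketch only: the field names mirror the `s3Config` getter discussed above, and `missingS3Fields`/`assertS3Config` are hypothetical names, not part of the PR.

```typescript
interface S3ConfigLike {
  accessKeyId?: string;
  secretAccessKey?: string;
  region?: string;
  bucketName?: string;
}

// Returns the required S3 fields that are missing or empty.
function missingS3Fields(config: S3ConfigLike): string[] {
  const required: (keyof S3ConfigLike)[] = [
    'accessKeyId',
    'secretAccessKey',
    'region',
    'bucketName',
  ];
  return required.filter((key) => !config[key]);
}

// Fails fast on a *partially* configured setup instead of silently
// falling back to local storage; a fully empty config is treated as
// an intentional opt-out of S3.
function assertS3Config(config: S3ConfigLike): void {
  const missing = missingS3Fields(config);
  if (missing.length > 0 && missing.length < 4) {
    throw new Error(`Partial S3 configuration; missing: ${missing.join(', ')}`);
  }
}
```

Calling `assertS3Config` once at service construction time turns a confusing runtime upload failure into a clear startup error.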
72-86: Defaulting to 'jpg' extension may lead to mismatched file formats.

You currently split the mimetype by `'/'` and default to `'jpg'`. If a user uploads a PNG file without specifying a proper mimetype, the file might be mislabeled. Consider using the original file extension if available, or at least warn that `'jpg'` is a fallback:

```diff
-const fileExtension = mimetype?.split('/')[1] || 'jpg';
+const extensionFromMimetype = mimetype?.split('/')[1];
+const fileExtension = extensionFromMimetype === undefined ? 'jpg' : extensionFromMimetype;
```
88-119: Watch out for repetitive code in S3 vs local fallback logic.

The logic for storing the file (either S3 or local) largely mirrors the pattern used for GraphQL streams below. To reduce duplication, consider creating a helper function or merging both buffer-based and stream-based logic. This yields cleaner code, avoids subtle bugs, and improves maintainability.
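A hedged sketch of what such a shared helper could look like (the name is hypothetical, not taken from the PR): normalize both input shapes to a single type up front, so the S3-versus-local branching only has to exist once downstream.

```typescript
import { Readable } from 'node:stream';

// Accepts either a ready Buffer or a stream and always yields a Buffer,
// letting a single storage path serve both the buffer-based and the
// GraphQL-stream-based entry points.
async function toBuffer(source: Buffer | Readable): Promise<Buffer> {
  if (Buffer.isBuffer(source)) {
    return source;
  }
  const chunks: Buffer[] = [];
  for await (const chunk of source) {
    chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk));
  }
  return Buffer.concat(chunks);
}
```

Note this keeps the whole file in memory, which is exactly the concern the next comment raises; it trades memory for having one code path.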
120-159: Potential high memory usage for large uploads.

By converting the entire GraphQL file stream into a buffer, you risk high RAM usage on large files. Consider piping your `createReadStream` directly into an S3 or local file write stream. This streaming approach can greatly reduce memory requirements by handling the file in chunks.

backend/src/project/project.resolver.ts (1)
96-107: Consider streaming file data directly to the service.

Manually buffering the file within the resolver introduces an extra step. Since the upload service also offers a buffer-based approach, you might unify or streamline logic to offload all reading/writing concerns to the service. This can help minimize memory usage in the resolver layer.
frontend/src/graphql/type.tsx (1)

340-343: Consider adding validation constraints or documentation for file uploads.

The input type is well-structured, but it would benefit from additional information about supported file types, size limits, or other constraints. Adding documentation comments would help developers understand the upload limitations:

```diff
 export type UpdateProjectPhotoInput = {
+  /** Supported formats: JPG, PNG, WebP. Max size: 5MB */
   file: Scalars['Upload']['input'];
   projectId: Scalars['ID']['input'];
 };
```
backend/src/project/project.service.ts (2)
433-438: Method name should match the GraphQL mutation name.

The method name `updateProjectPhotoUrl` is inconsistent with the GraphQL mutation name `updateProjectPhoto` and doesn't accurately reflect the new file upload functionality. Consider renaming the method to align with the mutation:

```diff
-async updateProjectPhotoUrl(
+async updateProjectPhoto(
   userId: string,
   projectId: string,
   file: Buffer,
   mimeType: string,
 ): Promise<Project> {
```
461-464: Consider adding more detailed error handling.

The error handling is good, but the error message could be more specific to help with debugging. Enhance the error message with more details:

```diff
 } catch (error) {
   this.logger.error('Error uploading image:', error);
-  throw new InternalServerErrorException('Failed to upload image:', error);
+  throw new InternalServerErrorException(
+    `Failed to upload image for project ${projectId}: ${error.message}`,
+  );
 }
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)

- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (24)

- .gitignore (1 hunks)
- backend/.env (0 hunks)
- backend/.env.development (0 hunks)
- backend/.env.example (1 hunks)
- backend/.gitignore (1 hunks)
- backend/package.json (3 hunks)
- backend/src/config/config.module.ts (1 hunks)
- backend/src/config/config.service.ts (1 hunks)
- backend/src/config/env.validation.ts (1 hunks)
- backend/src/guard/project.guard.ts (1 hunks)
- backend/src/main.ts (2 hunks)
- backend/src/project/dto/project.input.ts (2 hunks)
- backend/src/project/project.module.ts (1 hunks)
- backend/src/project/project.resolver.ts (2 hunks)
- backend/src/project/project.service.ts (3 hunks)
- backend/src/upload/upload.module.ts (1 hunks)
- backend/src/upload/upload.service.ts (1 hunks)
- frontend/next.config.mjs (1 hunks)
- frontend/package.json (1 hunks)
- frontend/src/graphql/request.ts (0 hunks)
- frontend/src/graphql/schema.gql (4 hunks)
- frontend/src/graphql/type.tsx (9 hunks)
- frontend/src/lib/client.ts (3 hunks)
- llm-server/.env (0 hunks)
💤 Files with no reviewable changes (4)
- frontend/src/graphql/request.ts
- llm-server/.env
- backend/.env
- backend/.env.development
✅ Files skipped from review due to trivial changes (3)
- backend/src/guard/project.guard.ts
- backend/.env.example
- .gitignore
🔇 Additional comments (33)
frontend/next.config.mjs (1)

31-31: Security consideration: wildcard hostname allows images from any domain.

Changing the hostname from a specific domain to a wildcard ('*') significantly broadens where images can be loaded from. While this supports loading from S3/CloudFront/R2 domains, it also introduces potential security risks:

- It could enable loading of malicious images (used for tracking or, in rare cases, client-side exploits)
- It increases vulnerability to hotlinking (others using your bandwidth to serve their content)

Consider whether you can use a more restrictive pattern that covers just your required domains:

```diff
-hostname: '*',
+hostname: '*.s3.amazonaws.com', // Or the specific pattern for your bucket
```

Or add multiple patterns for each required domain:

```js
remotePatterns: [
  {
    protocol: 'https',
    hostname: 'your-bucket.s3.amazonaws.com',
    pathname: '/**',
  },
  {
    protocol: 'https',
    hostname: 'your-cdn-domain.com',
    pathname: '/**',
  },
]
```

backend/.gitignore (1)
59-59: Good practice: ignoring environment files to prevent credential leakage.

Adding `.env` to the gitignore file is a security best practice that prevents sensitive credentials from being accidentally committed to source control. Make sure you've also provided a comprehensive `.env.example` file so developers know what environment variables need to be configured.

backend/src/upload/upload.module.ts (1)
1-10: Well-structured module for upload functionality following NestJS conventions.

This module follows NestJS best practices by:

- Properly importing dependencies
- Organizing related functionality into a cohesive module
- Exporting the service for use by other modules

The `UploadService` will be responsible for handling file uploads, likely to both S3 and local storage based on the PR title. This modular approach makes the code more maintainable and testable.

backend/src/main.ts (3)
6-6: Good: using a modern GraphQL upload package.

Using `graphql-upload-minimal` is a good choice for handling file uploads in GraphQL, as it's maintained and optimized compared to some alternatives.

22-22: Added header for Apollo client upload support.

The `Apollo-Require-Preflight` header is required for Apollo Client to properly handle file uploads. This allows the client to prepare the proper multipart request.
27-30: ❓ Verification inconclusive

Security consideration: maximum file size configuration.

You've configured a 50MB maximum file size limit, which is appropriate for many use cases. However, you should verify this aligns with:

- Your S3 bucket policies
- Your expected image sizes
- Backend resource constraints

Consider whether 50MB is the right size limit for your application's requirements. Also, ensure you're validating file types on the backend to prevent malicious file uploads.

🏁 Script executed:

```bash
#!/bin/bash
# Check if file type validation exists in the upload service
rg -A 5 "file.mimetype" --include "*.ts" ./src/upload
```

Length of output: 495
Security Configuration Verification

- The 50MB file size limit in `backend/src/main.ts` (lines 27–30) appears appropriate based on our current settings, but please ensure that:
  - Your S3 bucket policies permit uploads of this size.
  - The expected image sizes in your use case align with this limit.
  - Your backend resource constraints can handle the configured maximum file size.
- The initial check for file type validation in the upload service did not yield conclusive results due to a command issue (`--include` is not a valid ripgrep flag; `-g` is). Please verify manually (or re-run the corrected script below) to ensure file type validations are implemented to prevent malicious uploads.
Action Items:

- Manually verify that file type validation exists in your upload logic (e.g., by checking for handling of `file.mimetype` or similar checks in the relevant files under `./src/upload`).
- Consider updating your automated search commands if needed.
#!/bin/bash # Re-run the file type validation check using the correct ripgrep flag. rg -A 5 "file.mimetype" -g '*.ts' ./src/uploadLength of output: 221
Action Required: Verify Maximum File Size and File Type Validation
- The maximum file size limit (50MB) in
backend/src/main.ts
(lines 27–30) appears reasonable, but please ensure that:
- Your S3 bucket policies allow for uploads of this size.
- The limit aligns with your expected image/file sizes.
- Your backend infrastructure can effectively handle such large file uploads.
- Our attempt to locate file type validation (e.g., checks referencing
file.mimetype
) in the expected location (./src/upload
) did not yield results. Please manually verify whether file type validation is implemented elsewhere in the codebase. If it’s missing, consider adding proper file type checks to guard against malicious uploads.backend/src/config/config.module.ts (1)
1-7
: Correct import structure for NestJS configuration.The file correctly imports the necessary modules and services for configuration management, including the ConfigModule from @nestjs/config (renamed to NestConfigModule for clarity), and utilities for validation.
backend/package.json (2)
47-47
: Class-transformer dependency added correctly.The class-transformer library is correctly added with a specific version, which is necessary for the configuration validation implemented in the config module.
29-29
:✅ Verification successful
Verify AWS SDK version security and compatibility.
The AWS SDK for S3 has been added, which is necessary for S3 integration, but you should ensure this version is secure and compatible with your application.
🌐 Web query:
Are there any known security vulnerabilities in @aws-sdk/client-s3 version 3.758.0?
💡 Result:
The @aws-sdk/client-s3 version 3.758.0 doesn't have any direct security vulnerabilities reported in vulnerability databases like Snyk or npm advisories as of March 2025[6][9].
While earlier versions of AWS SDK v3 dependencies faced security alerts related to packages like fast-xml-parser (as seen in July 2024 reports)[3], these issues were resolved in subsequent updates. The latest releases, including 3.758.0, show no active vulnerabilities in dependency scans[6][9].
For ongoing security best practices:
- Monitor repositories like Snyk or npm advisories
- Review AWS SDK release notes for patches[4]
- Use dependency management tools to audit your project's full dependency tree
Always verify transitive dependencies, as vulnerabilities might arise in nested packages not directly maintained by AWS.
Citations:
- 1: https://community.cloudflare.com/t/aws-sdk-client-s3-v3-729-0-breaks-uploadpart-and-putobject-r2-s3-api-compatibility/758637
- 2: https://www.youtube.com/watch?v=w9YhVPepH0k
- 3: Recent fast-xml-parser vulnerability now shows dozens of high severity vulnerabilities during `npm install` (aws/aws-sdk-js-v3#6331)
- 4: https://github.com/aws/aws-sdk-js-v3/releases
- 5: https://github.com/aws/aws-sdk-js-v3/issues
- 6: https://security.snyk.io/package/npm/@aws-sdk%2Fclient-s3
- 7: https://www.npmjs.com/package/@aws-sdk/s3-request-presigner
- 8: https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/Package/-aws-sdk-client-s3/Interface/GetObjectRequest
- 9: https://snyk.io/test/npm/@aws-sdk/credential-provider-ini@3.758.0
- 10: https://www.npmjs.com/package/@aws-sdk/client-s3
AWS SDK Version Security Verified

The version `@aws-sdk/client-s3@3.758.0` in `backend/package.json` (line 29) has been reviewed against current security databases and shows no direct vulnerabilities. This version is confirmed to be secure for integration. Still, as a best practice, continue monitoring transitive dependencies using tools like Snyk and npm advisories.

backend/src/project/project.module.ts (2)
12-13: Import of necessary modules for the upload feature.

The imports of AppConfigModule and UploadModule are necessary for the S3 upload functionality and align with the PR objectives.

17-21: ✅ Verification successful

Module imports correctly structured, but check for circular dependencies.

The imports array is correctly updated to include the necessary modules, but verify there are no circular dependencies.

🏁 Script executed:

```bash
#!/bin/bash
# Check for potential circular dependencies
echo "Checking for potential circular dependencies..."
# Check if UploadModule imports ProjectModule
rg -i "import.*ProjectModule" --glob "*upload.module.ts" --no-ignore
# Check for other potential circular dependencies
rg -i "import.*from 'src/project'" --glob "*/upload/*.ts" --no-ignore
```

Length of output: 402
Below is an updated shell script that uses a file search (via `find`) to reverify potential circular dependencies. This script will:

- Look up all files named `upload.module.ts` and check if any of them import `ProjectModule`.
- Similarly, find the `project.module.ts` files and check if any of them import `UploadModule`.

Please run this script for further verification.

🏁 Script executed:

```bash
#!/bin/bash
echo "Finding upload.module.ts files..."
find backend/src -type f -name "upload.module.ts"
echo ""
echo "Checking upload.module.ts for references to ProjectModule..."
find backend/src -type f -name "upload.module.ts" -exec grep -Hin "import.*ProjectModule" {} \;
echo ""
echo "Finding project.module.ts files..."
find backend/src -type f -name "project.module.ts"
echo ""
echo "Checking project.module.ts for references to UploadModule..."
find backend/src -type f -name "project.module.ts" -exec grep -Hin "import.*UploadModule" {} \;
```

Length of output: 921
Verified Module Imports – No Circular Dependencies Detected

The module imports in `backend/src/project/project.module.ts` are correctly structured. Our verification confirms that although the project module imports `UploadModule`, there is no reciprocal import from `UploadModule` to `ProjectModule`, so no circular dependency exists between them.

backend/src/config/env.validation.ts (1)
3-21: Environment variables validation is correctly implemented.

The class correctly defines and validates essential environment variables with appropriate validation rules.
frontend/package.json (2)
39-41
: Apollo upload client dependencies added correctly.The addition of
@types/apollo-upload-client
andapollo-upload-client
packages is appropriate for implementing file upload functionality with Apollo Client. Both packages are at version 18.0.0, which ensures compatibility.
44-44
:❓ Verification inconclusive
Verify potential side-effects of dependency repositioning.
The
codefox-common
workspace dependency has been repositioned in the dependencies list. While the order of dependencies doesn't typically affect functionality, you should ensure that there are no build tool integrations that depend on this specific ordering.
🏁 Script executed:
#!/bin/bash # Check for any build scripts or configurations that might reference dependency ordering rg -l "codefox-common" --type=yaml --type=json --type=js --glob="!package.json" frontend/Length of output: 89
Manual Verification Required: Confirm No Build Tool Dependency on Ordering
The change in
frontend/package.json
moves"codefox-common": "workspace:*"
to a different position in the dependencies list. Although the automated search (usingrg
) did not reveal any build scripts or configuration files that referencecodefox-common
outside ofpackage.json
, there remains a possibility—albeit a small one—that a build integration might implicitly rely on dependency ordering.Please ensure through manual inspection (or additional testing) that no tool or custom build script is affected by the repositioning.
backend/src/project/dto/project.input.ts (1)
5-5
: Import from graphql-upload-minimal added correctly.The import of
FileUpload
andGraphQLUpload
types fromgraphql-upload-minimal
is appropriate for supporting file upload functionality.backend/src/config/config.service.ts (1)
1-8
: Configuration service implementation follows best practices.The
AppConfigService
is correctly implemented with dependency injection and wraps the NestJS's ConfigService. It provides type-safe access to environment variables.frontend/src/lib/client.ts (3)
14-14
: Import path for createUploadLink is correct and future-proof.The import specifically uses the
.mjs
extension, which is proper for modern ESM packages and ensures compatibility with different bundlers.
89-94
: HTTP link chain is implemented correctly.The composition of links using
from()
ensures that all middleware (error handling, logging, auth) is properly applied before the upload link. The TypeScript cast is necessary and correctly implemented.
96-109
: Split link implementation for WebSocket and HTTP is correct.The logic to route GraphQL operations between WebSocket (for subscriptions) and HTTP (for queries and mutations) has been properly updated to use the new link chain.
backend/src/upload/upload.service.ts (3)
1-20
: Good introduction of the UploadService class.This new service cleanly encapsulates file-upload logic, distinguishing between local storage and S3 with an optional S3 client. Good use of NestJS'
@Injectable()
annotation and typed imports.
167-175
:streamToBuffer
implementation looks correct.The implementation of gathering chunks and concatenating them is standard. Just be aware that for very large files, storing the entire file in memory may not be ideal, as mentioned earlier. Otherwise, this is a clean and straightforward method.
181-193
: Solid approach for constructing a bucket URL.The fallback logic to either use
publicUrl
, or build a Cloudflare endpoint URL, is clear and well-structured. This should handle various S3 providers consistently.frontend/src/graphql/schema.gql (2)
57-57
: Single-line docstring for Date scalar is fine.Condensing the docstring to one line is purely stylistic and does not affect functionality. No issues here.
116-116
: Nice consolidation of the project photo update logic.Using
updateProjectPhoto(input: UpdateProjectPhotoInput!)
with a dedicated `file: Upload!`
field is more expressive than separate arguments. The new `Upload`
scalar is well-defined, improving clarity and enabling direct file handling.Also applies to: 211-214, 216-216
backend/src/project/project.resolver.ts (1)
90-92
: Good use of ProjectGuard for access control.Ensuring only authorized users can update the project photo is essential. This guard pattern is consistent with other project mutations, keeping your security posture strong.
frontend/src/graphql/type.tsx (4)
39-40
: Proper scalar addition for file upload support.The
Upload
scalar type is correctly defined for handling file uploads in GraphQL. This is an appropriate pattern that follows Apollo GraphQL best practices.
159-159
: Well-named mutation for the new file upload functionality.The rename from
updateProjectPhotoUrl
to `updateProjectPhoto`
accurately reflects the change in functionality from accepting a URL to accepting a file upload.
215-216
: Good input structure for the mutation.Using a structured input object with the
input
parameter name is a best practice for GraphQL mutations, providing better organization and extensibility.
913-916
: Correctly implemented scalar configuration.The
UploadScalarConfig
interface properly extends the GraphQL scalar type configuration, which is necessary for custom scalar types.backend/src/project/project.service.ts (4)
29-29
: Good service integration for file uploads.The import of the
UploadService
is appropriate for implementing the file upload functionality.
44-44
: Proper dependency injection.The
UploadService
is correctly injected following NestJS dependency injection patterns.
446-451
: Good organization of uploaded files.The use of a structured subdirectory path (
projects/${projectId}/images
) provides a clean organization of project images, making future file management easier.
456-458
: Appropriate logging for debugging.The debug log message provides useful information for tracking photo URL updates, which is helpful for troubleshooting.
```diff
@@ -51,6 +53,7 @@
   "gpt-3-encoder": "^1.1.4",
   "graphql": "^16.9.0",
   "graphql-subscriptions": "^2.0.0",
+  "graphql-upload-minimal": "^1.6.1",
```
💡 Verification agent
🧩 Analysis chain
GraphQL upload package added correctly but consider security implications.
The graphql-upload-minimal package is correctly added for handling file uploads in GraphQL, but ensure you implement proper file size limits and validation.
Make sure to:
- Implement file size limits
- Validate file types
- Scan uploads for malware if possible
- Set appropriate CORS policies for upload endpoints
Here's a shell script to check if file size limits are implemented:
🏁 Script executed:
```bash
#!/bin/bash
# Check if file size limits are implemented for uploads
echo "Checking for file size limit configurations..."
# Look for file size limit configurations in main.ts or similar files
rg -i "file.*size|maxFileSize|sizeLimit" --glob "*.ts" --no-ignore
# Check if graphqlUploadExpress is configured with maxFileSize
rg -i "graphqlUploadExpress.*maxFileSize" --glob "*.ts" --no-ignore
```
Length of output: 415
Action: Enhance Security for GraphQL File Uploads
The dependency in backend/package.json is added correctly and the file size limit is actively configured in backend/src/main.ts via:
graphqlUploadExpress({ maxFileSize: 50000000, maxFiles: 10 })
However, please ensure the following additional security measures are implemented:
- File Type Validation: Verify that uploads are checked against allowed file types to prevent unwanted content.
- Malware Scanning: Consider integrating malware scanning for uploaded files.
- CORS Policies: Confirm that the upload endpoints enforce appropriate CORS policies.
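The file-type and size checks called out above could be centralized in a small guard that runs before any storage call. This is a minimal sketch: the helper name `validateUpload`, the allowed types, and the 5MB cap are illustrative assumptions, not values taken from this PR.

```typescript
// Hypothetical validation guard; allowed types and the 5MB cap are assumptions.
const ALLOWED_MIMETYPES = new Set([
  'image/jpeg',
  'image/png',
  'image/webp',
  'image/gif',
]);
const MAX_FILE_SIZE = 5 * 1024 * 1024; // 5MB

function validateUpload(file: Buffer, mimeType: string): void {
  // Reject anything outside the image whitelist
  if (!ALLOWED_MIMETYPES.has(mimeType)) {
    throw new Error(`Unsupported file type: ${mimeType}`);
  }
  // Reject oversized payloads before they reach storage
  if (file.length > MAX_FILE_SIZE) {
    throw new Error('File size exceeds the 5MB limit');
  }
}
```

Malware scanning and CORS still have to happen elsewhere (a scanning service and the HTTP layer, respectively); this guard only covers the first two bullets.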
// S3/Cloudflare R2 Configuration - all optional
@IsOptional()
@IsString()
S3_ACCESS_KEY_ID?: string;

@IsOptional()
@IsString()
S3_SECRET_ACCESS_KEY?: string;

@IsOptional()
@IsString()
S3_REGION?: string;

@IsOptional()
@IsString()
S3_BUCKET_NAME?: string;

@IsOptional()
@IsString()
S3_ENDPOINT?: string;

@IsOptional()
@IsString()
S3_ACCOUNT_ID?: string;

@IsOptional()
@IsString()
S3_PUBLIC_URL?: string;
💡 Verification agent
🧩 Analysis chain
S3 configuration is optional but should have local fallback implementation.
Making S3 configuration optional is good for flexibility, but ensure there's a fallback for storing files locally when S3 is not configured.
Ensure the upload service has a fallback mechanism when S3 credentials aren't provided. You should implement a strategy pattern that can switch between S3 storage and local file storage based on configuration.
Let's check if this is already implemented:
🏁 Script executed:
#!/bin/bash
# Check if there's a fallback mechanism for local file storage
echo "Checking for local file storage fallback implementation..."
# Look for conditional storage selection based on config
rg -A 10 -B 10 "S3_" --glob "*upload*.ts" --no-ignore
# Check for filesystem operations indicating local storage
rg -i "fs\.|writeFile|createReadStream|createWriteStream" --glob "*upload*.ts" --no-ignore
Length of output: 863
Action Required: Implement Conditional Storage Strategy
The environment validation in backend/src/config/env.validation.ts
correctly marks all S3 configuration values as optional, and local file storage functionality is present in backend/src/upload/upload.service.ts
(e.g., use of fs-extra
methods for file operations). However, there isn’t any conditional logic (or strategy pattern) to select S3 storage when credentials are provided versus falling back to local storage when they’re not.
- Ensure the upload service dynamically switches between S3 and local file storage based on your configuration.
- Consider refactoring the service to use a strategy pattern so that when valid S3 credentials are available, file uploads are directed to S3; otherwise, the service should reliably use the local fallback implementation.
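The conditional selection described above could be sketched with a strategy interface. The class and method names here are illustrative, not the actual UploadService in this PR; the real S3 branch would call the AWS SDK's S3 client rather than return a fabricated URL.

```typescript
// Illustrative strategy-pattern sketch; names are assumptions, not PR code.
interface UploadResult {
  url: string;
}

interface StorageStrategy {
  upload(file: Buffer, mimeType: string, subdirectory: string): Promise<UploadResult>;
}

class LocalStorageStrategy implements StorageStrategy {
  async upload(file: Buffer, mimeType: string, subdirectory: string): Promise<UploadResult> {
    // The real service would write to disk here (e.g. with fs-extra)
    const fileName = `${Date.now()}.${mimeType.split('/')[1]}`;
    return { url: `/uploads/${subdirectory}/${fileName}` };
  }
}

class S3StorageStrategy implements StorageStrategy {
  async upload(file: Buffer, mimeType: string, subdirectory: string): Promise<UploadResult> {
    // The real service would send a PutObjectCommand via the S3 client here
    const fileName = `${Date.now()}.${mimeType.split('/')[1]}`;
    return { url: `https://bucket.example.com/${subdirectory}/${fileName}` };
  }
}

// Pick the strategy once at startup, based on whether S3 is fully configured
function selectStrategy(hasS3Configured: boolean): StorageStrategy {
  return hasS3Configured ? new S3StorageStrategy() : new LocalStorageStrategy();
}
```

With this shape, the rest of the upload service depends only on `StorageStrategy`, so the S3/local decision is made in exactly one place.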
get hasS3Configured(): boolean {
  const config = this.s3Config;
  return !!(
    config.accessKeyId &&
    config.secretAccessKey &&
    config.region &&
    (config.endpoint || config.accountId)
  );
}
🛠️ Refactor suggestion
S3 configuration validation is incomplete.
The hasS3Configured
method checks for the presence of mandatory fields but doesn't verify if the bucket name is set, which is essential for S3 operations.
Add bucket name validation to the S3 configuration check:
get hasS3Configured(): boolean {
const config = this.s3Config;
return !!(
config.accessKeyId &&
config.secretAccessKey &&
config.region &&
+ config.bucketName &&
(config.endpoint || config.accountId)
);
}
/**
 * Get server port from environment
 */
get port(): number {
  return this.configService.get('PORT');
}

/**
 * Get JWT secret key for token generation
 */
get jwtSecret(): string {
  return this.configService.get('JWT_SECRET');
}

/**
 * Get JWT refresh token secret
 */
get jwtRefresh(): string {
  return this.configService.get('JWT_REFRESH');
}

/**
 * Get password hashing salt rounds
 */
get saltRounds(): number {
  return this.configService.get('SALT_ROUNDS');
}

/**
 * Get OpenAI API base URI
 */
get openaiBaseUri(): string {
  return this.configService.get('OPENAI_BASE_URI');
}
🛠️ Refactor suggestion
Config getters can throw if environment variables are missing.
The config getters directly access environment variables without checking if they exist. This could lead to runtime errors if required environment variables are not set.
Consider adding default values or explicit error handling:
get jwtSecret(): string {
- return this.configService.get('JWT_SECRET');
+ const secret = this.configService.get('JWT_SECRET');
+ if (!secret) {
+ throw new Error('JWT_SECRET environment variable is not set');
+ }
+ return secret;
}
Or use the built-in ConfigService default value capability:
get jwtSecret(): string {
- return this.configService.get('JWT_SECRET');
+ return this.configService.get('JWT_SECRET', 'default-secret-for-development-only');
}
try {
  // Use the upload service to handle the file upload
  const subdirectory = `projects/${projectId}/images`;
  const uploadResult = await this.uploadService.upload(
    file,
    mimeType,
    subdirectory,
  );

  // Update the project with the new URL
  project.photoUrl = uploadResult.url;

  this.logger.debug(
    `Updated photo URL for project ${projectId} to ${uploadResult.url}`,
  );

  return this.projectsRepository.save(project);
} catch (error) {
  this.logger.error('Error uploading image:', error);
  throw new InternalServerErrorException('Failed to upload image:', error);
}
🛠️ Refactor suggestion
Implement file validation for enhanced security.
While the upload implementation is functional, it lacks validation for file type, size, and content, which could pose security risks.
Add validation before processing the upload:
try {
+ // Validate file type
+ const allowedMimeTypes = ['image/jpeg', 'image/png', 'image/webp', 'image/gif'];
+ if (!allowedMimeTypes.includes(mimeType)) {
+ throw new BadRequestException('Unsupported file type. Only JPEG, PNG, WebP, and GIF are allowed.');
+ }
+
+ // Validate file size (e.g., 5MB limit)
+ const maxSize = 5 * 1024 * 1024; // 5MB in bytes
+ if (file.length > maxSize) {
+ throw new BadRequestException('File size exceeds the maximum limit of 5MB.');
+ }
// Use the upload service to handle the file upload
const subdirectory = `projects/${projectId}/images`;
const uploadResult = await this.uploadService.upload(
file,
mimeType,
subdirectory,
);
backend/.env.example
Outdated
add NODE_ENV=DEV # 'PROD', 'TEST', 'DEV' here
Actionable comments posted: 2
🧹 Nitpick comments (1)
backend/src/project/project.resolver.ts (1)

91-117: Consider renaming the service method for consistency. While the resolver method has been renamed from updateProjectPhotoUrl to updateProjectPhoto to reflect file upload functionality, the service method is still called updateProjectPhotoUrl. This could be confusing. Consider renaming the service method to match:

  // Call the service with the extracted buffer and mimetype
- return this.projectService.updateProjectPhotoUrl(
+ return this.projectService.updateProjectPhoto(
    userId,
    projectId,
    buffer,
    mimetype,
  );
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)

- backend/.env.example (1 hunks)
- backend/package.json (3 hunks)
- backend/src/project/project.resolver.ts (2 hunks)
- backend/src/project/project.service.ts (3 hunks)
- frontend/package.json (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (4)
- backend/.env.example
- frontend/package.json
- backend/package.json
- backend/src/project/project.service.ts
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Install and Build Frontend
🔇 Additional comments (3)
backend/src/project/project.resolver.ts (3)

18-18: Good addition of the new input type. The addition of the UpdateProjectPhotoInput import aligns well with the resolver method changes to support file uploads.

91-96: Good security practice adding ProjectGuard. The addition of @UseGuards(ProjectGuard) ensures proper authorization before allowing project photo updates. The method signature change from updateProjectPhotoUrl to updateProjectPhoto better reflects the new functionality that handles direct file uploads rather than just URLs.

97-98: Good destructuring and logging. Properly destructuring the input and logging the operation provides clarity and helps with debugging.
// Extract the file data
const { createReadStream, mimetype } = await file;

// Buffer the file content
const chunks = [];
for await (const chunk of createReadStream()) {
  chunks.push(chunk);
}
const buffer = Buffer.concat(chunks);
🛠️ Refactor suggestion
Consider adding file size validation and error handling.
While the file buffering implementation is correct, there are some potential improvements for robustness:
- No file size validation is present, which could lead to memory issues with large files
- There's no error handling if the file stream fails
Consider adding the following improvements:
  // Extract the file data
  const { createReadStream, mimetype } = await file;

  // Buffer the file content
  const chunks = [];
+ let totalSize = 0;
+ const MAX_FILE_SIZE = 10 * 1024 * 1024; // 10MB limit
+ try {
    for await (const chunk of createReadStream()) {
+     totalSize += chunk.length;
+     if (totalSize > MAX_FILE_SIZE) {
+       throw new Error(`File size exceeds the maximum allowed size of ${MAX_FILE_SIZE / (1024 * 1024)}MB`);
+     }
      chunks.push(chunk);
    }
+ } catch (error) {
+   this.logger.error(`Error processing file upload: ${error.message}`);
+   throw new Error(`Failed to process file upload: ${error.message}`);
+ }
  const buffer = Buffer.concat(chunks);
// Call the service with the extracted buffer and mimetype
return this.projectService.updateProjectPhotoUrl(
  userId,
  projectId,
  photoUrl,
  buffer,
  mimetype,
);
💡 Verification agent
❓ Verification inconclusive
Consider validating file type for security.
The service now accepts a buffer and mimetype, but there's no validation to ensure the file is an acceptable image type.
Consider adding file type validation before passing to the service:
// Call the service with the extracted buffer and mimetype
+ const ALLOWED_MIMETYPES = ['image/jpeg', 'image/png', 'image/gif', 'image/webp'];
+ if (!ALLOWED_MIMETYPES.includes(mimetype)) {
+ throw new Error(`Unsupported file type: ${mimetype}. Allowed types: ${ALLOWED_MIMETYPES.join(', ')}`);
+ }
return this.projectService.updateProjectPhotoUrl(
userId,
projectId,
buffer,
mimetype,
);
🏁 Script executed:
#!/bin/bash
# Check if there's mimetype validation in the service method
rg -A 10 "updateProjectPhotoUrl.*buffer.*mimetype" --glob "*.ts" backend/src/
Length of output: 79
Security Update Required: Validate Image File Type
I verified that there’s no evidence of file type validation in the updateProjectPhotoUrl
flow. The current implementation passes the buffer and mimetype directly to the service without checking if the mimetype is acceptable. Please consider adding a check against a whitelist of allowed mimetypes. For example:
// Call the service with the extracted buffer and mimetype
+ const ALLOWED_MIMETYPES = ['image/jpeg', 'image/png', 'image/gif', 'image/webp'];
+ if (!ALLOWED_MIMETYPES.includes(mimetype)) {
+ throw new Error(`Unsupported file type: ${mimetype}. Allowed types: ${ALLOWED_MIMETYPES.join(', ')}`);
+ }
return this.projectService.updateProjectPhotoUrl(
userId,
projectId,
buffer,
mimetype,
);
This update will mitigate potential security risks by ensuring that only expected image file types are processed.
Summary by CodeRabbit
New Features
Bug Fixes