add new serverless-framework-sqs-dynamodb kit #709

Open · wants to merge 62 commits into main

Conversation

@dustinsgoodman (Contributor) commented Dec 23, 2022

Type of change

  • Documentation change
  • Bug fix
  • New kit

Summary of change

Adds a new starter kit to the project that utilizes a fully serverless AWS stack. This kit includes:

  • docker-compose for local resourcing
    • DynamoDB + Admin
    • ElasticMQ
    • Redis
  • mechanisms for database table creation and seeding
  • healthcheck endpoint
  • code formatters
  • data model
  • testing
  • basic API CRUD operations (a minimal fetch-one handler sketch follows this list)
    • Fetch all
    • Create
    • Fetch one
    • Update
    • Delete
  • documentation tooling
  • CI
  • Type Generation
  • Queue Example
  • Deployment Instructions
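
To give a feel for the API handlers, here is a minimal sketch of the "fetch one" operation. It is illustrative only: the `getItem` helper, its import path, and the response shapes are assumptions, not the kit's exact code.

```ts
// Minimal sketch of a "fetch one" handler, assuming API Gateway proxy events.
// getItem and its import path are hypothetical stand-ins, not the kit's exact code.
import type { APIGatewayProxyHandler } from 'aws-lambda';
import { getItem } from '../models/technology/getItem'; // assumed helper

export const handler: APIGatewayProxyHandler = async (event) => {
  const id = event.pathParameters?.id;
  if (!id) {
    return { statusCode: 400, body: JSON.stringify({ message: 'Missing id' }) };
  }

  const technology = await getItem(id);
  if (!technology) {
    return { statusCode: 404, body: JSON.stringify({ message: 'Not found' }) };
  }

  return { statusCode: 200, body: JSON.stringify(technology) };
};
```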

Checklist

  • These changes follow the contributing guidelines
  • This starter kit has been approved by the maintainers
  • Changes for this new starter kit are being pushed to an integration branch instead of main
  • All dependencies are version locked
  • [ ] This fix resolves #
  • [ ] I have verified the fix works and introduces no further errors

@dustinsgoodman dustinsgoodman added WIP Work in Progress DO NOT MERGE labels Dec 23, 2022
@netlify netlify bot commented Dec 23, 2022

Deploy Preview for starter-dev canceled.

🔨 Latest commit: fea8909
🔍 Latest deploy log: https://app.netlify.com/sites/starter-dev/deploys/63c80072afd6a2000914aeb6

@aws-amplify-us-east-1

This pull request is automatically being deployed by Amplify Hosting (learn more).

Access this pull request here: https://pr-709.d2zmkukv1jdnvq.amplifyapp.com

@dustinsgoodman dustinsgoodman removed WIP Work in Progress DO NOT MERGE labels Jan 1, 2023
@dustinsgoodman dustinsgoodman marked this pull request as ready for review January 1, 2023 20:41
@mbicknese (Contributor) previously approved these changes Jan 2, 2023 and left a comment:

Awesome to have this as a starter! Left some food for thought, but no show-stoppers.

# configures DynamoDB resource for persistent storage
dynamodb:
image: amazon/dynamodb-local
hostname: dynamodb
Contributor:

Small remark, this attribute seems redundant as the service name is used as hostname by default. It stood out to me as the other services don't have it specified.

Contributor (Author):

It is, but it isn't. Someone could rename the service, which they totally can do, but this attribute lets me reference a stable hostname later in the compose file for the db-admin feature, so it's necessary.

"name": "serverless-framework-sqs-dynamodb",
"version": "1.0.0",
"description": "A starter kit utilizing the Serverless Framework in conjunction with AWS DynamoDB & SQS.",
"author": "Dustin Goodman",
Contributor:

Maybe you don't want to put yourself up as "THE" author.

Contributor (Author):

Well, I am THE author of this kit 😉 - I do want to start introducing authors to kits, though, for better attribution and crediting, which makes this project more open-source friendly.

Contributor:

@TapaiBalazs added some good context. I was under the impression that we didn't have it on the other kits because the kits belong to all of us. But maybe I'm just a bit too much of a communist 😉

Contributor (Author):

I just read more on the official docs on this: https://docs.npmjs.com/cli/v9/configuring-npm/package-json#people-fields-author-contributors

Looks like we need author & contributor fields. IMO the person who defined the kit elements is the author, and everyone who helped make it what it is counts as a contributor.

@TapaiBalazs (Contributor) Jan 13, 2023:

I understand your point, but the generator script currently does the following:

  1. User runs the generator
  2. User selects which kit they want, and then provides a name
  3. The script downloads the folder of the starter kit from GitHub
  4. The package.json "name" and "version" properties get overwritten

Therefore, if Jane Doe generates this starter kit, their package.json will list you as the author of their project. This is the reason why the other kits do not have the author field in the package.json file.

Of course, the generator script can be updated to remove the author from the package.json; we could even add a third prompt so the user can set their own author field when the script runs.

Update: we could update the script to get the user name and e-mail from their git configuration and set them as the author field.
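
A rough sketch of what that git-based author step could look like in the generator, assuming Node's child_process and fs; setAuthorFromGit and targetDir are illustrative names, not the actual CLI code:

```ts
// Hypothetical sketch of the suggested generator step: read the user's git identity and
// write it into the freshly downloaded kit's package.json.
import { execSync } from 'node:child_process';
import { readFileSync, writeFileSync } from 'node:fs';
import { join } from 'node:path';

const gitConfig = (key: string): string => {
  try {
    return execSync(`git config --get ${key}`).toString().trim();
  } catch {
    return ''; // git missing or the key is unset
  }
};

export const setAuthorFromGit = (targetDir: string): void => {
  const name = gitConfig('user.name');
  const email = gitConfig('user.email');
  const pkgPath = join(targetDir, 'package.json');
  const pkg = JSON.parse(readFileSync(pkgPath, 'utf8'));

  // Replace the kit author with the generating user's identity, when available.
  pkg.author = email ? `${name} <${email}>` : name || undefined;
  writeFileSync(pkgPath, JSON.stringify(pkg, null, 2) + '\n');
};
```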

Contributor (Author):

Hmmm... this raises a good discussion point. Something I've been pondering is whether we should put a "starter-dev.json" or equivalent file in each kit and have the CLI use it instead of the package.json. In general, here are the problems I'm thinking about and wanting to solve:

  1. Are contributors properly given credit for their submissions and work regardless of the initiation point?
  2. Can we support more non-Node based starter kits? Right now the Deno kit needs a bit of a hack to be included, but what if we started to include non-JavaScript tooling as well? We may need to rethink our CLI strategy in this regard.

I'll be honest that I'm less concerned about the generated kit containing the original author's details. I think that's actually appropriate in this case. Renaming was meant to let others have a project name custom to what they're doing out of the box. The metadata file I'm thinking about is probably the real solve here, though, so this may be a moot point.

},
},
'esbuild': {
packager: 'yarn',
Contributor:

I suppose there's no simple way to set this to the package manager used to install the deps?

My concern is that not all devs have yarn locally, but all node installations come with npm.

Contributor (Author):

I'll note this in the README in the serverless section. That's a really good shoutout. I was developing with yarn, so I needed it set to this, and my instructions recommend it.
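
For reference, a small sketch of how the packager could be derived rather than hard-coded. Detecting it via npm_config_user_agent is an assumption, not something the kit does today; serverless-esbuild accepts 'npm', 'yarn', or 'pnpm' for this option:

```ts
// Sketch only: pick the esbuild packager from whichever package manager invoked the
// script, falling back to npm, which ships with every Node installation.
const userAgent = process.env.npm_config_user_agent ?? '';
const packager = userAgent.startsWith('yarn')
  ? 'yarn'
  : userAgent.startsWith('pnpm')
    ? 'pnpm'
    : 'npm';

export const custom = {
  esbuild: {
    packager,
  },
};
```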

export const handler: DynamoDBStreamHandler = async (event) => {
console.log('Example Stream Processor Handler initiated');

const recordHandler = async (record: DynamoDBRecord) => {
Contributor:

Seeing that this method doesn't rely on the handler's state, we could move it out of the body. It's a micro-optimization, I'm aware, but it might set a good example.

Contributor (Author):

That's a great recommendation. I concur.
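
A minimal sketch of the suggested refactor, assuming the aws-lambda types already used in the kit (the per-record body is elided):

```ts
import type { DynamoDBRecord, DynamoDBStreamHandler } from 'aws-lambda';

// Hoisted to module scope since it doesn't touch the handler's state.
const recordHandler = async (record: DynamoDBRecord): Promise<void> => {
  console.log('Processing record', record.eventID); // placeholder for the real work
};

export const handler: DynamoDBStreamHandler = async (event) => {
  console.log('Example Stream Processor Handler initiated');
  await Promise.all(event.Records.map(recordHandler));
};
```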

@@ -0,0 +1 @@
export const DEFAULT_CACHE_TIME = 300_000; // 5 minutes = 1000 ms/s * 60 s/min * 5 min
Contributor:

Underscore thousand separators, nice!

@TapaiBalazs (Contributor) left a comment:

Looks alright, but I have some comments and questions

@@ -38,7 +38,10 @@ export { ShareIcon } from './ShareIcon';
export { LinkedinIcon } from './LinkedinIcon';
export { QwikIcon } from './QwikIcon';
export { SolidJsIcon } from './SolidJsIcon';
- export { DenoIcon } from './DenoIcon.tsx';
+ export { DenoIcon } from './DenoIcon';
Contributor:

Nice catch


### General Commands

- `build` bundles the project using the serverless packaging serverless. The produced artifacts will ship bundles shipped to AWS on deployment. You can optionally pass `--analyze <function name>` to run the bundle analyzer and visualize the results to understand your handler bundles.
Contributor:

using the serverless packaging serverless.

This wording feels a bit strange to me.

Contributor:

will ship bundles shipped to AWS on deployment

This too feels a bit strange

Contributor (Author):

LOL - I was tired when I wrote these. Will fix. Good catch.

- `deploy` ships the project to the configured AWS account using the Serverless Framework CLI command.
- `start` runs the `serverless-offline` provided server for local development and testing. Be sure to have the local docker infrastructure running to emulate the related services.
- `test` runs `jest` under the hood.
- `lint` runs `eslint` under the hood. You can use all the eslint available command line arguments. To lint the entire project, run `yarn lint .`, or equivalent. You can affix `--fix` to auto-correct linting issues that eslint can handle.
Contributor:

You can use all the eslint available command line arguments

I think "You can use all the available eslint command line arguments" would be the right word order.


This README uses `yarn` for commands. If you're using `npm` or `pnpm`, utilize the equivalent version of the commands.

1. Create a `.env` file:
Contributor:

Do you need an AWS account set up, configured, and logged in before these steps? 🤔

Contributor (Author):

Technically speaking - no. This kit works FULLY offline and does not require AWS at all. That being said, for deployment and other things, it might be nice later. Will add notes.


### Infrastructure Commands

- `infrastructure:up` creates docker container and related images and runs them in the background. This should only be needed once during initial setup.
Contributor:

Based on the docker-compose.yml, the infrastructure creates 4 containers (dynamodb, dynamodb-admin, sqs, and redis), so I think "docker containers" should be used in these instructions.

Contributor (Author):

LOL - good catch.

Contributor (Author):

Actually, looking at the Docker Desktop dashboard, it's 1 container with 4 images running, so I think what I have is correct.
[Docker Desktop screenshot]

@TapaiBalazs (Contributor) Jan 16, 2023:

I don't mean to be annoying about this, but that's 4 containers running 4 images. As far as I understand Docker, 1 container runs 1 image. In your screenshot you have 4 running containers.

What you call a container with the dropdown is just UI grouping; if you run docker ps you will see all four of those as separate containers, each with its own container ID.

Contributor (Author):

I interpret the screenshot as 1 container with 4 images, but I won't claim to be a Docker expert. I will ping a DevOps buddy to confirm.

@mpstein:

Dustin's friend here. @TapaiBalazs is correct. If you look at the docker-compose file, you'll see that each one of those containers is defined with its own image and configuration.

Contributor (Author):

TIL - thanks @mpstein ❤️

Contributor (Author):

fix pushed

import { getCacheKey } from './getCacheKey';

export const destroy = async (key: string) => {
const item = await deleteItem(process.env.TECHNOLOGIES_TABLE, { id: key });
Contributor:

Does the deleteItem method return the deleted entry? 🤔

That's what the unit test validates, but it feels strange to me after PostgreSQL, where delete did not return anything, just information about how many rows were affected, and the REST endpoint returned OK.

Contributor (Author):

My implementation has chosen to return the record. It doesn't necessarily have to do this, but DynamoDB does return it, so I do.
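
For context, the DynamoDB side of this looks roughly like the sketch below. The AWS SDK v3 DocumentClient wiring and the exact deleteItem signature are assumptions; ReturnValues: 'ALL_OLD' is what makes DynamoDB hand back the deleted item:

```ts
import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBDocumentClient, DeleteCommand } from '@aws-sdk/lib-dynamodb';

const client = DynamoDBDocumentClient.from(new DynamoDBClient({}));

// Sketch of a deleteItem helper; the kit's actual implementation may differ.
export const deleteItem = async (tableName: string, key: { id: string }) => {
  const { Attributes } = await client.send(
    new DeleteCommand({
      TableName: tableName,
      Key: key,
      ReturnValues: 'ALL_OLD', // return the item as it was before deletion
    })
  );
  return Attributes ?? null; // deleted entry, or null if nothing matched the key
};
```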

Contributor:

Understood, I just wanted confirmation on the DynamoDB implementation.

return newTechnology;
}

return null;
Contributor:

Do we suppress an error here? If yes, shouldn't we log it? Should this return null ever be reached? 🤔

Contributor (Author):

No error suppression, so this is good as is.

Contributor:

I've now been able to put together a mental model of the structure and understand that this is not the handler, but a level down.

@@ -0,0 +1,7 @@
/**
* Utility function for checking where functions are being run locally via serverless offline
Contributor:

Suggested change:
- * Utility function for checking where functions are being run locally via serverless offline
+ * Utility function for checking if functions are being run locally via serverless offline
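
For reference, a minimal sketch of what such a utility can look like. It relies on serverless-offline setting IS_OFFLINE in the environment; the function name is illustrative rather than the kit's actual export:

```ts
/**
 * Utility function for checking if functions are being run locally via serverless offline
 */
export const isOffline = (): boolean => process.env.IS_OFFLINE === 'true';
```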


afterAll(async () => {
ddbMock.restore();
await removeFromCache(getCacheKey('87af19b1-aa0d-4178-a30c-2fa8cd1f2cff'));
Contributor:

Where do you set/hardcode this cache key when setting the item into the cache? 🤔

I also think that this uuid string should be exported as a MOCK_TECHNOLOGY_ID constant and used across tests.

Contributor (Author):

These were just bad copy/pastes from other tests I wrote. They aren't actually needed.

return responseHelper(StatusCodes.NOT_FOUND, null);
}

return responseHelper(StatusCodes.OK, technology);
Contributor:

Should this method invalidate the cache entry for the item's id? 🤔

If the GET technology/:id endpoint is cached for 5 minutes and then an update occurs, the previous value gets returned by the GET endpoint. I guess the same goes for the DELETE and CREATE methods as well; namely, the cache of the all-technologies endpoint should also be invalidated if that is cached. 🤔

Contributor (Author):

I handle it a layer deeper, in models/technology/update. It is doing the invalidation, but the invalidation is handled by the model layer that determines whether a value comes from the cache or the database, rather than by the handler layer. Separation of concerns was my reasoning.
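
A rough sketch of that model-layer pattern, to make the separation concrete. The updateItem helper, the Technology shape, and the import paths are assumptions; getCacheKey and removeFromCache mirror helpers referenced elsewhere in this PR:

```ts
import { getCacheKey } from './getCacheKey';
import { removeFromCache } from '../../cache/removeFromCache'; // assumed path
import { updateItem } from '../../dynamodb/updateItem'; // assumed helper

interface Technology {
  id: string;
  [field: string]: unknown;
}

export const update = async (id: string, changes: Partial<Technology>) => {
  const updated = await updateItem(process.env.TECHNOLOGIES_TABLE as string, { id }, changes);

  // Invalidate the cached copy so the next GET technology/:id reads fresh data;
  // the handler layer never needs to know about the cache.
  await removeFromCache(getCacheKey(id));

  return updated;
};
```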
