What is on Offer with CodeCatalyst?
CodeCatalyst is one of the tools most directly aimed at helping users build out their applications on Amazon Web Services: a multi-function platform that is easy to use, quick to start with, and simple for sharing projects across a team. All of the features described below are tailored toward streamlining the application development pipeline by removing as many opportunities for pitfalls as possible.
Templates and Blueprints
Whatever language the dev team decides to build the application in, there are various prebuilt foundations with established resources ready to simplify the initial setup. This extends beyond just providing the basics for coding: each blueprint includes a source code repository, an issue tracker, and a collection of integrated tools and resources, including a CI/CD pipeline and AWS hosting resources. Upon starting a project, users are greeted by a library of blueprints, each with a name, a brief description of its functionality, a version number, the date it was last updated, and tags to help with searching.
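What a blueprint has provisioned can also be inspected programmatically. The sketch below uses boto3's codecatalyst client and assumes boto3 with CodeCatalyst support is installed and CodeCatalyst credentials are already configured; the space and project names are hypothetical placeholders.

```python
import boto3

# Minimal sketch: list the source repositories a blueprint created for a project.
# Assumes boto3 with CodeCatalyst support and configured CodeCatalyst credentials.
client = boto3.client("codecatalyst")

repos = client.list_source_repositories(
    spaceName="my-space",       # hypothetical space name
    projectName="my-web-app",   # hypothetical project created from a blueprint
)

for repo in repos["items"]:
    print(repo["name"], "-", repo.get("description", "no description"))
```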
Developer Environments
One major problem with developer toolkits is that one team member can end up with a slightly different toolchain or library version than the rest of the team, potentially resulting in bugs that remain undetected and unresolved. Developer Environments remove that unneeded variance and ensure that every team member gets the same setup, so that every experience, wanted or not, is repeatable. The entire configuration is kept in a single file in the source code repository, with options to scale instances to 2, 4, 8, or 16 virtual CPUs. That file defines all of the resources a project needs for coding, testing, and debugging. These environments can be paused, restarted, and deleted at a moment's notice, helping to cut back on overhead.
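Pausing and resuming environments can be scripted as well. The following is a minimal sketch using boto3's codecatalyst client, assuming CodeCatalyst credentials are configured; the space and project names and the instance type string are assumptions for illustration.

```python
import boto3

# Minimal sketch: pause running Dev Environments, then resume one on a larger
# instance. Assumes boto3 with CodeCatalyst support and configured credentials.
client = boto3.client("codecatalyst")
SPACE, PROJECT = "my-space", "my-web-app"   # hypothetical names

envs = client.list_dev_environments(spaceName=SPACE, projectName=PROJECT)["items"]

# Pause every running environment to cut back on overhead.
for env in envs:
    if env["status"] == "RUNNING":
        client.stop_dev_environment(spaceName=SPACE, projectName=PROJECT, id=env["id"])

# Later, resume the first environment on an assumed 4-vCPU instance size.
if envs:
    client.start_dev_environment(
        spaceName=SPACE,
        projectName=PROJECT,
        id=envs[0]["id"],
        instanceType="dev.standard1.medium",  # assumed name for the 4-vCPU size
    )
```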
Pipelines
Application builds are deployed on flexible, managed infrastructure through pre-built pipelines established by the initial blueprint, running on either on-demand or pre-provisioned compute. They support a variety of machine sizes and can work with whatever environment the user decides to bring on board. Configuration is done through YAML files or a visual editor, and pipelines can also run GitHub Actions. More importantly, the pipelines here are designed by default to work with anything in-house at Amazon Web Services, from the Elastic Container Service (ECS) to the Elastic Compute Cloud (EC2). This makes testing and deploying to multiple regions and accounts simpler while remaining secure.
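A workflow run can also be triggered from code. The rough sketch below assumes a recent boto3 that exposes the CodeCatalyst workflow operations (list_workflows, start_workflow_run) and configured credentials; the space, project, and workflow names are hypothetical.

```python
import boto3

# Rough sketch: find a blueprint-created workflow and start a run.
# Assumes recent boto3 with CodeCatalyst workflow operations and credentials.
client = boto3.client("codecatalyst")
SPACE, PROJECT = "my-space", "my-web-app"   # hypothetical names

workflows = client.list_workflows(spaceName=SPACE, projectName=PROJECT)["items"]
deploy = next(w for w in workflows if "Deploy" in w["name"])  # pick a deploy workflow

run = client.start_workflow_run(
    spaceName=SPACE,
    projectName=PROJECT,
    workflowId=deploy["id"],
)
print("Started workflow run:", run["id"])
```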
Collaboration Tools
The most important part of any development cycle is keeping every development team and team member connected. There is no shortage of issues that can arise simply from not having everyone on the same page. Upon accepting the email invitation to a project, a developer sees the entire picture and can skip straight to working without fussing over details like setting up tools or required libraries. Once brought on, a teammate will see all projects currently in progress, an extensive overview of what has happened, recent workflows, and a board of issues and features that have been resolved or are still in progress.
Expected Pricing
CodeCatalyst is one of the AWS services that offers a Free Tier. By default, users receive 2,000 compute minutes per month, Dev Environment instances limited to two virtual CPUs with 4 GB of RAM on Linux only, 60 Dev Environment hours per month, and 16 GB of storage per Dev Environment. Free users also get 10 GB of source repository storage, 64 GB of aggregate Dev Environment storage, and 10 GB of data transfer per month. For more details, see our article about the AWS Free Tier.

Anything beyond what is initially provided requires an upgrade to the Standard Tier at $4 per user per month. Time spent logged on counts as time spent using CodeCatalyst. Along with more compute time and pre-provisioned resources, Standard Tier users get additional Dev Environment options of 4, 8, or 16 virtual CPUs, the choice of a Windows operating system, and more storage. Any additional resource use costs extra per hour, although Linux vCPUs get a hefty discount when using pre-provisioned resources. The specific rates for each instance type can be found on AWS's own pricing page.
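As a rough illustration of how the 2,000 free compute minutes can be consumed, here is a small back-of-the-envelope calculator; the overage rate used is a hypothetical placeholder, not an actual AWS price.

```python
# Back-of-the-envelope sketch of free-tier compute consumption. The 2,000-minute
# monthly allotment comes from the article; the overage rate is a hypothetical
# placeholder (see AWS's pricing page for real rates).
FREE_COMPUTE_MINUTES = 2_000

def monthly_compute_overage(runs_per_day: int, minutes_per_run: float,
                            days: int = 30,
                            overage_rate_per_minute: float = 0.01) -> dict:
    """Estimate compute usage and any overage beyond the free allotment."""
    used = runs_per_day * minutes_per_run * days
    overage = max(0.0, used - FREE_COMPUTE_MINUTES)
    return {
        "minutes_used": used,
        "minutes_over_free_tier": overage,
        "estimated_overage_cost": overage * overage_rate_per_minute,
    }

# Example: 10 workflow runs a day at 8 minutes each is 2,400 minutes a month,
# i.e. 400 minutes beyond the free allotment.
print(monthly_compute_overage(runs_per_day=10, minutes_per_run=8))
```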