Amazon Web Services – CodeCatalyst

When a development team is building out an application, it helps to have the same resources, the tools for planning and testing, and access to the application all in one place. CodeCatalyst comes with a slew of continuous integration/continuous delivery (CI/CD) tools, can leverage other AWS services, and can be connected to other AWS projects on an account. As a collaborative tool, it makes it easy to introduce new members to a project and to review all activity and test results from a single dashboard. It is a complete package of the tools needed to work securely on every step of an application's lifecycle.

What is on Offer with CodeCatalyst?

CodeCatalyst is one of the tools most directly tied to helping users build out applications on Amazon Web Services: a multi-function platform that is easy to use, easy to start with, and makes sharing projects straightforward. All of the features described below are tailored toward streamlining the application development pipeline by removing as many opportunities for pitfalls as possible.

Templates and Blueprints

Whatever language the dev team decides to build the application in, there are various prebuilt foundations with established resources ready to simplify initial construction. This extends beyond the basics for coding: a blueprint includes a source code repository, an issue tracker, and a collection of integrated tools, including the CI/CD pipeline and AWS hosting resources. Upon starting a project, users are greeted by a library of options, each with a name, a brief description of its functionality, a version number, the date it was last updated, and tags to help with searching.

 

Developer Environments

One major problem with developer toolkits is that one member can have a slightly different toolchain or library version than the rest of the team, potentially resulting in bugs that remain undetected and unresolved. Dev Environments remove this variance by giving every team member the same setup, so that all behavior, wanted or not, is repeatable. The configuration is kept in a single file in the source code repository, with options to scale instances to 2, 4, 8, or 16 virtual CPUs. The file defines all resources for a project, covering coding, testing, and debugging. These environments can be paused, restarted, or deleted at a moment's notice, helping to cut back on overhead.
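As a rough illustration of that single configuration file, CodeCatalyst Dev Environments follow the devfile format; the image name and command below are placeholders, not official values:

```yaml
# Illustrative devfile defining a shared Dev Environment.
# Image and commands are assumptions for the sake of example.
schemaVersion: 2.0.0
metadata:
  name: team-dev-environment
components:
  - name: tools
    container:
      image: public.ecr.aws/sample/universal:latest   # hypothetical base image
commands:
  - id: install-deps
    exec:
      component: tools
      commandLine: npm ci   # every teammate installs identical dependencies
```

Because this file lives in the repository, every new Dev Environment spun up from it starts from the same toolchain.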

 

Pipelines

Application builds are deployed on flexible, managed infrastructure through pre-built pipelines established by the initial blueprint, using either on-demand or pre-provisioned compute. They run on a variety of machine sizes and can work with whatever environment the user brings on board. Configuration is done through YAML files or a visual editor, and workflows can incorporate GitHub Actions. More importantly, the pipelines are designed by default to work with the rest of AWS, from Elastic Container Service (ECS) to Elastic Compute Cloud (EC2). This way, testing and deploying to multiple regions and accounts is simplified yet secure.
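For a sense of the YAML side of configuration, a minimal workflow sketch might look like the following; the action identifier and steps are illustrative, not verified against the current schema:

```yaml
# Illustrative CodeCatalyst workflow: build and test on every push to main.
Name: build-and-test
SchemaVersion: "1.0"
Triggers:
  - Type: PUSH
    Branches:
      - main
Actions:
  Build:
    Identifier: aws/build@v1        # assumed action identifier
    Inputs:
      Sources:
        - WorkflowSource
    Configuration:
      Steps:
        - Run: npm ci
        - Run: npm test
```

The same definition can be edited in the visual editor, which reads and writes this YAML underneath.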

 

Collaboration Tools

The most important part of any development cycle is maintaining the bridge between all the teams and members involved; there is no shortage of issues that can arise simply from not having everyone on the same page. Upon accepting an email invitation to a project, a developer sees the entire picture and can skip straight to working without fussing over details like installing tools or required libraries. Once brought on, the teammate sees all projects currently in progress, an extensive overview of what has happened, recent workflows, and a board of issues and features that have been resolved or are works in progress.


Expected Pricing

CodeCatalyst is one of the AWS services that benefits users through the Free Tier. By default, users receive 2,000 compute minutes monthly, Dev Environments with two virtual CPUs, 4GB of memory, and a Linux OS only, 60 dev hours for each, and 16GB of storage each. Free users also get 10GB of source storage, 64GB of aggregate Dev Environment storage, and 10GB of data transfer per month. Additional details on the Free Tier are covered in a separate article. Anything beyond what is initially provided requires an upgrade to the Standard Tier at $4 per month, and time spent logged on counts as time spent using CodeCatalyst. Along with more compute time and pre-provisioned resources, Standard users get additional virtual CPU options at 4GB, 8GB, and 16GB, the option of a Windows operating system, more storage, and more compute time. Any additional resource use costs extra per hour used, though Linux vCPUs get a hefty discount for utilizing pre-provisioned resources. The specific rates for each vCPU size and more can be found on AWS' own pricing page.
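To make the Free Tier limits concrete, here is a small sketch that checks a month's usage against the allowances stated above. The limits are taken from this article, not from live AWS documentation, and the helper itself is hypothetical:

```python
# Free Tier allowances as stated in the article (per month).
FREE_COMPUTE_MINUTES = 2000
FREE_SOURCE_STORAGE_GB = 10
FREE_DEV_ENV_STORAGE_GB = 64

def fits_free_tier(compute_minutes: int, source_gb: float, dev_env_gb: float) -> bool:
    """Return True if the month's usage stays within the Free Tier limits."""
    return (compute_minutes <= FREE_COMPUTE_MINUTES
            and source_gb <= FREE_SOURCE_STORAGE_GB
            and dev_env_gb <= FREE_DEV_ENV_STORAGE_GB)

# Example: 1,500 build minutes, 5GB of source, 32GB of Dev Environment storage
# stays within the Free Tier; 2,500 build minutes does not.
```

Exceeding any one of these limits is what triggers the move to the $4/month Standard Tier.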

Dolan Cleary

I am a recent graduate of the University of Wisconsin - Stout and am now working with AllCode as a web technician, currently within the marketing department.

