Contact: info@strato.earth
Strato Workflows
A web platform for building cloud-based data processing workflows for satellite imagery, LiDAR and more.
Drag & drop your Dockerized algorithms to create automated, serverless workflows. Trigger workflows manually or via a RESTful API request.
Hosted in your AWS account.
Hosted in your organization’s Amazon Web Services and Github accounts. Optimized cloud resource usage with no additional cloud fees added on top of your AWS bill.
Save thousands of engineering hours. Automate months ahead of schedule. Security, scaling and cloud resource provisioning handled automatically and transparently.
No vendor lock: your algorithms are saved in Github and ready to migrate. Your workflows are exportable as Amazon States Language for AWS Step Functions.
Who is Strato Workflows for?
Strato Workflows is for any organization that has developed its own data processing algorithms, and wishes to combine them into automated workflows which run in the cloud. Users typically process satellite imagery and LiDAR data, but Strato Workflows can be used with virtually any kind of input.
Example use case 1:

A precision agriculture startup has developed a series of algorithms in R which it combines in various configurations to consume and process satellite imagery and soil sample datasets. The company is building its own web-based SaaS user interface, and needs to automate its data processing algorithms in the cloud so that they can be triggered from the web application. The company cannot expose its IP (algorithms) outside of its own cloud account, and needs to launch quickly and without spending millions of dollars on DevOps and data engineering.

Example use case 2:

An engineering firm prepares for its infrastructure projects by creating 3D visualizations and geomatic analytics of project sites based on LiDAR datasets. The firm has developed its own range of python algorithms for processing the LiDAR, and is now looking for a scalable, rapid and cost-effective way to automate these LiDAR processing workflows in the cloud.

Solution:

Both organizations can use the Strato Workflows Github integration and drag & drop dashboard to Dockerize and transform their data processing algorithms into automated cloud-based workflows in a matter of days, and without exposing their IP or data. The workflows can be triggered via a RESTful API request (such as from the precision agriculture firm's custom UI), or manually via the Strato Workflows dashboard. The outputs of the workflows can be automatically placed in a designated AWS S3 bucket or written to a database.

Why use Strato Workflows?

Reduced complexity & engineering hours: The Github integration and drag & drop dashboard allow the data scientists who wrote and understand the algorithms to build and maintain the automated workflows themselves. This reduces the lengthy, error-prone process of communication and verification between data science and DevOps/data engineering teams.

Speed: Because Strato Workflows provides industry-grade cloud infrastructure and CI/CD out of the box, and allows data scientists to build workflows with a simple drag & drop interface, companies are able to achieve automation in days and weeks instead of months and years.

Best practices & no vendor lock: Strato Workflows enforces software development best practices, such as version control (Github) and containerization (Docker). Workflows themselves are exportable as Amazon States Language JSON files, and can be re-imported for use directly in the AWS Step Functions console.

Security & IP protection: Strato Workflows is launched in your organization's own AWS account (to protect your IP), with expert security configuration implemented via AWS services such as Cognito, VPC, IAM and Systems Manager.

Reduced cloud fees: No additional cloud fees are added on top of your organization's AWS bill, and moreover, Strato Workflows helps to decrease and monitor your cloud fees via use of serverless architectures, billing alarms, and automated compute pricing estimates.

How it works
Strato Workflows consists of a dashboard (web-based graphical user interface) and Github integration. We launch and configure Strato Workflows in your Amazon Web Services account free of charge.
1
Create a Github repo
Run our bash script in your terminal to spin up a new repository in your organization’s Github account.
2
Add your algorithm
Add a data processing algorithm in any language to the repo and merge to main. Our secure Github integration pushes a Docker image of the repo to your AWS account.
3
Build serverless compute tasks
Use the Workflows user interface (hosted in your AWS account) to turn your algorithms into serverless compute tasks with a few clicks of the mouse.
4
Build a workflow
Drag-and-drop to orchestrate the algorithms into serverless data processing workflows.
5
Process your data
Execute the workflows manually, on a schedule, or via a RESTful API request.
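For the API-triggered case, a request to start a workflow execution might be constructed as follows. This is a minimal sketch for illustration only: the endpoint URL, route, and payload field names are assumptions, not the actual Strato Workflows API.

```python
import json
import urllib.request

# Hypothetical API base URL -- your deployment's actual endpoint will differ.
API_BASE = "https://workflows.example-org.aws.strato.earth"

def build_execution_request(workflow_id: str, input_s3_key: str) -> urllib.request.Request:
    """Construct a POST request that would start a workflow execution.

    The route ("/v1/executions") and JSON field names are illustrative
    assumptions, not the documented Strato Workflows API.
    """
    body = json.dumps({
        "workflowId": workflow_id,
        "input": {"s3Key": input_s3_key},
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{API_BASE}/v1/executions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# A client application (for example, a custom SaaS UI) would send this
# request whenever a user uploads new data for processing.
req = build_execution_request("ndvi-pipeline", "incoming/scene-42.tif")
```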
Github Integration
The Strato Workflows Github Integration creates Docker images of your organization's Github repositories, and pushes the images to your AWS account.
An organization's code is never exposed outside of its own Github and AWS Accounts.

The Github integration is a CI/CD script which runs in the Github Actions of your organization's selected Github repositories. We provide a script for users to run in their machine's terminal. The script will add Strato Earth's CI/CD script to an existing repository, or spin up an entirely new repository in your organization's Github account.

The repository created by the script includes a Dockerfile boilerplate and Github Actions YAML file. Users may add data processing algorithms written in any language to the repository.
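As a rough idea of what such a boilerplate might look like, here is a hypothetical Dockerfile for a Python algorithm. The generated repository contains its own files; this sketch is not the actual boilerplate.

```dockerfile
# Hypothetical boilerplate for illustration -- the generated repository
# ships its own Dockerfile, which may differ.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# The container entrypoint is the data processing algorithm itself.
ENTRYPOINT ["python", "process.py"]
```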

When a commit is made to a designated branch, the repository's Github Actions are triggered, generating and pushing an updated Docker image to the Elastic Container Registry of the organization's AWS account. The Github Actions typically take only 1-2 minutes to run, allowing users to iterate rapidly and quickly see the results of code changes in workflows.

The Github integration itself is an industry-grade CI/CD system. Beyond creating updated Docker images of your Github repository, it runs static code and security analysis.

Dashboard
The Strato Workflows dashboard, for building and executing workflows, is a secure, web-based drag & drop user interface hosted in your own AWS account.

In the dashboard, users can build serverless compute tasks out of their Docker images and drag, drop and connect them together to create workflows.

The workflows can be executed in the dashboard, where users can track execution progress in real time, and inspect logs and outputs.

Create as many dashboard user accounts as necessary for your team.

AWS Step Functions & Serverless Compute
Strato Workflows is built on a combination of AWS services, including Step Functions, ECS Fargate, S3, EFS, RDS Aurora, CloudWatch and more.
The preconfiguration of secure, cost-efficient cloud resources, combined with the Github integration and drag & drop dashboard, eliminates thousands of engineering hours.

After the Github Integration pushes a Docker Image of your repository to AWS ECR, the image is available in the dashboard. The image can be configured to run as a serverless compute instance via AWS ECS Fargate. Simply select the desired amount of RAM and CPU, and an ECS Fargate task is generated within a minute.

Strato Workflows' serverless architecture means that compute cloud fees are only incurred when workflows are triggered to run.

The ECS Fargate task can then be added as a state in a workflow. The internal workflow engine is AWS Step Functions. As the user positions and connects different states in the dashboard, complex Amazon States Language (ASL) code is generated under the hood, and the underlying AWS Step Functions state machine is automatically updated.

Users have access to the raw ASL generated by Strato Workflows. A workflow's ASL can be exported, stored in version control such as Github, and even imported directly into AWS Step Functions, which means there is no "vendor lock" for Strato Workflows users.
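For reference, a minimal state machine in Amazon States Language looks like the following. This is a generic ASL example per the AWS specification, not an actual Strato Workflows export; the state name and task configuration are illustrative.

```json
{
  "Comment": "Minimal ASL example (illustrative, not an actual export)",
  "StartAt": "AtmosphericCorrection",
  "States": {
    "AtmosphericCorrection": {
      "Type": "Task",
      "Resource": "arn:aws:states:::ecs:runTask.sync",
      "End": true
    }
  }
}
```

Exported ASL of this shape can be pasted directly into the AWS Step Functions console to recreate the state machine outside of Strato Workflows.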

Input, Output & Preview Data
As a workflow runs, data is passed between algorithms (workflow states) via designated paths in AWS EFS or S3.
An EFS drive and S3 bucket are automatically connected to all workflows, with cost and security optimizations pre-configured.

A set of simple environment variables is automatically passed to the Docker containers running in ECS Fargate, indicating the correct EFS path or S3 bucket and key for the inputs and outputs of each workflow state (i.e., ECS Fargate task).
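Inside a container, an algorithm could resolve its input and output locations from those variables along these lines. The variable names and default paths here are assumptions for illustration; the actual names injected by Strato Workflows may differ.

```python
import os

def resolve_io_paths(env: dict) -> tuple:
    """Read the input/output locations handed to a workflow state.

    WORKFLOW_INPUT_PATH / WORKFLOW_OUTPUT_PATH and the /mnt/efs defaults
    are hypothetical names used for illustration only.
    """
    input_path = env.get("WORKFLOW_INPUT_PATH", "/mnt/efs/input")
    output_path = env.get("WORKFLOW_OUTPUT_PATH", "/mnt/efs/output")
    return input_path, output_path

# In a running container, the real environment would be used:
in_path, out_path = resolve_io_paths(dict(os.environ))
```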

Output data may be previewed directly in the dashboard, or downloaded with a click.

Logs, Cost & Performance Monitoring
Logs generated by your serverless compute tasks (ECS Fargate) are viewable in the Strato Workflows user interface during and after workflow executions.
Cost estimates are provided for different RAM/CPU configurations of ECS Fargate tasks, and for each Fargate task execution after a state runs in a workflow.
Cloud fee notifications and billing alarms are provided automatically.
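The shape of such a per-task cost estimate can be sketched as follows. The per-vCPU and per-GB rates below are assumptions (Fargate pricing varies by region and changes over time), and Strato Workflows' own estimator may compute this differently.

```python
# Assumed rates for illustration only -- check current AWS Fargate
# pricing for your region before relying on any estimate.
VCPU_PER_HOUR = 0.04048   # assumed USD per vCPU-hour
GB_PER_HOUR = 0.004445    # assumed USD per GB-hour

def estimate_task_cost(vcpus: float, memory_gb: float, minutes: float) -> float:
    """Rough cost of one ECS Fargate task execution.

    Fargate bills per vCPU-hour and per GB-hour of memory for the
    task's running duration.
    """
    hours = minutes / 60.0
    return round(hours * (vcpus * VCPU_PER_HOUR + memory_gb * GB_PER_HOUR), 6)

# e.g. a 4 vCPU / 8 GB task running for 15 minutes
cost = estimate_task_cost(4, 8, 15)
```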
Processing Big Data
After a Dockerized algorithm is turned into an ECS Fargate serverless compute task, it can be inserted in a "map state" for parallel processing of large datasets. Simply point the map state to the location of the input dataset in EFS or S3, and the algorithm will be run in parallel on each element of the dataset.
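In Amazon States Language terms, a map state of this kind has roughly the following shape. This is a generic ASL Map state per the AWS specification, not code generated by Strato Workflows; the state names, items path, and concurrency value are illustrative.

```json
{
  "Type": "Map",
  "ItemsPath": "$.inputFiles",
  "MaxConcurrency": 100,
  "Iterator": {
    "StartAt": "ProcessTile",
    "States": {
      "ProcessTile": {
        "Type": "Task",
        "Resource": "arn:aws:states:::ecs:runTask.sync",
        "End": true
      }
    }
  },
  "End": true
}
```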
Map Visualization
Strato Workflows includes a web map application for sharing and visualizing big data internally with your team, or externally with clients or the public. Your workflows may also be triggered via the web map interface. Hosted in your cloud account and at your domain name.
Learn more about Strato Maps
Support & Documentation
Our team will launch and configure Strato Workflows in your Amazon Web Services account free of charge.

Strato Earth offers a dedicated Slack channel for each client in order to promptly resolve any issues or questions that may arise.

Bug fixes are provided free of charge. Custom feature requests, once approved, are billed at a discounted consulting rate, as are consulting services for code refactoring.

Strato Workflows documentation is available at strato.earth/docs.

Frequently Asked Questions
Does Strato Workflows provide data processing algorithms out-of-the-box?
No. While this is potentially a future direction for the product, presently Strato Workflows does not provide pre-built data processing algorithms. Strato Workflows is for organizations that have their own data processing code bases which they wish to automate in the cloud in the most efficient manner possible.
Can Strato Workflows only be used for processing satellite imagery and LiDAR?
No. Most of our users are focused on imagery and LiDAR processing, and we provide some related functionality for importing and visualizing these data types. However, Strato Workflows can be used to process virtually any kind of data: whatever your Dockerized algorithms are designed to process.
Can Strato Earth help me to refactor my code for use in Strato Workflows?
Yes. We provide consulting services to Strato Workflows users at a discounted rate.
What is Strato Earth's business model?
Strato Workflows is available for a flat monthly fee, with considerable discounts for small businesses. As we are committed to minimizing our users' cloud fees, we do not charge additional usage-based fees on top of your AWS bill.
Case Studies
Resolv by Advanced Remote Sensing Inc.

Project type: Strato Workflows, Strato Maps

Using only scene statistics without ancillary data inputs, ARSI's RESOLV™ software allows small sat companies to achieve near real-time atmospheric correction of satellite imagery, so that the imagery can be applied to time-sensitive use cases.

ARSI needed a cloud-based platform to automate their atmospheric correction process and provide live access to clients. For demo purposes, a web interface was also required so that users could select imagery for atmospheric correction, and download the corrected files.

ARSI was able to automate and provide live access to their Resolv software within a matter of days via the Strato Workflows Github integration and drag-and-drop UI. Strato Workflows was launched in ARSI's own Amazon Web Services account, so that their proprietary algorithm was not exposed outside of their own version control and cloud accounts. As ARSI implements new versions of the Resolv software tailored to the technical specifications of different small sat providers, ARSI scientists can rapidly spin up new cloud-based automated workflows via Strato Workflows in a matter of hours and without hiring a DevOps team.

Strato Maps, hosted in ARSI's own cloud account and at a domain name of their choosing, was integrated with Strato Workflows to provide the demo web interface where users can select images for atmospheric correction via the Resolv software.

High Resolution Inventory Solutions (HRIS) by Tesera Systems

Project type: Strato Workflows

Tesera’s High Resolution Inventory Solutions (HRIS) deploys advanced machine learning and data analytics to produce reliable forestry and natural resource inventories. HRIS is a blend of area-based modelling (using LiDAR, multispectral imagery, ground plot data, complemented with terrain and climate data) and individual tree crown delineation to provide a more reliable, scalable and consistent forest inventory.

Tesera needed to automate their HRIS data processing algorithms to run at scale against big data in the cloud. The solution needed to allow for re-combination of the algorithms into a series of differentiated workflows. Tesera's own forestry experts and data scientists would need to be able to build and iterate on the workflows directly, as only they have the required expertise to combine their algorithms into intricate automated workflows, and evaluate the output datasets.

Tesera opted to use Strato Workflows to achieve automation. Strato Workflows' Github integration allowed for Tesera's existing Docker images to be re-combined into cloud-based workflows with only light modification. Tesera's experts were able to use the Strato Workflows drag-and-drop UI to rapidly combine their algorithms into a series of workflows, with easy configuration of parallel processing nodes so that thousands of input files could be processed concurrently by the same algorithm.

The workflow duplication feature has allowed Tesera data scientists to instantly create duplicates of their complex workflows, which can then be easily modified into variant workflows.

Looking for documentation? Find it at strato.earth/docs.