When your dev team tells you they are planning to move one of your key applications to the cloud to become a SaaS service, usually one of two things will happen:
- First, they will attempt to recreate the exact functionality with the same architecture running on rented computers in the cloud. Advice: Run away; the project is doomed.
- Second, they will reimagine and re-engineer that application to take advantage of some of the fundamental benefits of cloud computing; in other words, the cloud-native approach.
While it is possible to port traditional applications and data operations directly to the cloud, many SaaS vendors have gained first-mover benefits from embracing a cloud-first, cloud-native architecture. In this edition of eWEEK Data Points, George Demarest, Senior Director of Marketing at Kyligence, explains why he believes the cloud-native approach will win out over a direct software port to the cloud.
Data Point No. 1: Elasticity provides a cost advantage
It is almost never a good idea to try to match your physical infrastructure specs to cloud virtual infrastructure. Even though Amazon, Azure, and Google clouds provide virtual machines of all sizes that more or less match the specs of their physical counterparts, you'll almost never get the best deal for your cloud spend that way: a fixed-size fleet ignores the cloud's core advantage, which is that capacity can expand and contract with demand, so you pay only for what you actually use. These platforms also provide flexible pricing options such as reserved instances, enterprise agreements, and savings plans. Your cloud admin in IT can guide you.
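To make that trade-off concrete, here is a minimal back-of-the-envelope sketch. The instance counts, hourly rates and workload profile are purely hypothetical placeholders, not actual AWS, Azure or Google list prices; the point is only to show how elastic sizing changes the math versus a lift-and-shift fleet.

```python
# Illustrative comparison of "match the data center spec" versus elastic sizing.
# All rates and counts below are hypothetical, not real cloud list prices.

HOURS_PER_MONTH = 730

# Option A: mirror the physical footprint -- 16 always-on large instances.
on_demand_rate = 0.40          # assumed $/hour per large instance
fixed_fleet = 16
fixed_cost = fixed_fleet * on_demand_rate * HOURS_PER_MONTH

# Option B: elastic fleet -- 4 reserved baseline instances at a discounted
# rate, plus on-demand instances only during the ~8 busy hours per day.
reserved_rate = 0.25           # assumed discounted $/hour (reserved/savings plan)
baseline = 4
peak_extra = 12
busy_hours = 8 * 30
elastic_cost = (baseline * reserved_rate * HOURS_PER_MONTH
                + peak_extra * on_demand_rate * busy_hours)

print(f"Fixed fleet:   ${fixed_cost:,.0f}/month")
print(f"Elastic fleet: ${elastic_cost:,.0f}/month")
```

Under these assumed numbers the always-on fleet costs roughly two and a half times as much per month, even before any further tuning.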
Data Point No. 2: Separation of computing and storage enhances both
In your data center, the servers you purchase typically have some direct-attached storage (DAS) that you can use to store temporary files, images, documents and the like. But when you enter the SaaS arena in the cloud, it is dangerous to rely on this model, because your compute/CPU needs may rise and fall very differently than your data storage needs. The cloud enables you to use object storage services such as AWS S3 or ADLS, which can be purchased, optimized and managed separately from your computing requirements. This separation of computing and storage will help you avoid a "success crisis," such as suddenly adding 10,000 new users.
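As a simple illustration of how storage becomes an independent service rather than a disk bolted to a server, the sketch below writes and then reads an object using AWS's boto3 SDK; the bucket and key names are placeholders.

```python
# Minimal sketch: compute and storage are decoupled -- any instance (or
# serverless function) with the right credentials can read and write the same
# object store. Bucket and key names are placeholders for illustration.
import boto3

s3 = boto3.client("s3")

# Write a temporary artifact to object storage instead of local disk (DAS).
s3.put_object(
    Bucket="example-saas-artifacts",
    Key="reports/2024/q1-summary.json",
    Body=b'{"status": "draft"}',
)

# Any other compute node can later fetch it; the storage is scaled and billed
# separately from whatever servers happen to be running.
obj = s3.get_object(Bucket="example-saas-artifacts",
                    Key="reports/2024/q1-summary.json")
print(obj["Body"].read().decode())
```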
Data Point No. 3: Scaling reads and writes separately improves both
Similarly, when you are deploying data-rich SaaS services to a potentially huge concurrent user base, you will want to choose the best technologies for data discovery, data manipulation and data retrieval. In the past, relational databases might have been the logical choice for all of these functions, but at cloud-scale data volumes and user counts, it may make sense to pick more specialized cloud services, such as columnar storage, in-memory databases, or data streaming. That way, if the majority of your workloads are read-intensive and your database writes are bursty or intermittent, normal SaaS operations can continue even when writes spike (say, at the end of a quarter or year). This provides a better user experience and a more resilient operating model.
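One common way to put this into practice (a sketch of one pattern, not the only option) is to route reads to replicas or a read-optimized store while all writes go to the primary; the hostnames, database name and credentials below are placeholders.

```python
# Sketch of read/write splitting so read traffic scales independently of
# bursty writes. Endpoints and credentials are placeholders.
import random
import psycopg2

PRIMARY = "writer.example-db.internal"          # handles all writes
READ_REPLICAS = [                               # scale these out for read load
    "reader-1.example-db.internal",
    "reader-2.example-db.internal",
]

def get_connection(for_write: bool):
    host = PRIMARY if for_write else random.choice(READ_REPLICAS)
    return psycopg2.connect(host=host, dbname="saas", user="app",
                            password="example-password")

# Reads fan out across replicas; a quarter-end write burst only loads the primary.
with get_connection(for_write=False) as conn, conn.cursor() as cur:
    cur.execute("SELECT count(*) FROM orders")
    print(cur.fetchone())
```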
Data Point No. 4: Design for cloud object storage
Going further with the issue of storage, a cloud-first design decision is to lean on the advantages of cloud storage services such as S3 or ADLS. Cloud providers will be under continuing competitive pressure to improve and innovate within their storage services, and application architects who closely track and quickly adopt these innovations will have an advantage over competitors who are more circumspect. Take, for example, Amazon's recent addition of strong read-after-write consistency to S3. Having this guarantee built into the storage service may make paying for a separate SQL query engine unnecessary for some use cases. Other areas that could benefit from this competitive innovation include security, encryption, compression and other cost-saving measures.
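To show what that consistency guarantee buys you, here is a small boto3 sketch (bucket and key are placeholders): an object can be read back immediately after it is written, with no retry loop or separate bookkeeping layer in the application.

```python
# Sketch: with S3's strong read-after-write consistency, a newly written
# object is immediately visible to subsequent reads. Bucket and key names
# are placeholders for illustration.
import boto3

s3 = boto3.client("s3")
bucket, key = "example-saas-data", "events/latest.csv"

s3.put_object(Bucket=bucket, Key=key, Body=b"user_id,event\n42,signup\n")

# The very next read returns the data just written -- no polling required.
latest = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
assert latest.startswith(b"user_id")
```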
Data Point No. 5: Make it foolproof
A clear advantage for companies that embrace a cloud-native approach is a mindset of immediacy, automation, and simplification (though nothing is ever simple). SaaS providers can live or die by whether they provide instant provisioning, set-it-and-forget-it configuration, and a push-button user experience for even complex IT or business functions. The other side of being foolproof is enabling users to be more productive through increased automation, built-in predictive intelligence, or machine learning that keeps your environment running optimally. SaaS companies must be adept at creating foolproof workflows and at increasing the productivity and effectiveness of their users.
Data Point No. 6: Provide a plausible exit strategy
While each cloud provider features proprietary cloud services (data warehouses, ETL, messaging, storage), they also provide a rich set of ready-to-go open-source technologies like Spark, Kafka, Flink, MySQL, Postgres and many others. It would go too far to say that using these open-source offerings makes it easy to move from one cloud to another, but it does mean that a migration might not require a total rewrite if a switch in cloud providers is in the cards. More to the point, many IT architects are looking down the road toward a multi-cloud model, as many companies already deal with two or more cloud providers. If your organization can expertly exploit cloud services from different vendors, then being able to shift emphasis from one cloud to another is a first step in future-proofing your solution.
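As a small illustration of why open-source building blocks ease portability (a sketch; the hostnames are placeholders for each provider's managed Postgres endpoint), the same standard Postgres driver and SQL run unchanged on AWS, Azure, or Google Cloud, with only the connection settings swapped.

```python
# Sketch: application code written against open-source Postgres stays the
# same across clouds; only the connection settings change. Hostnames are
# illustrative placeholders.
import os
import psycopg2

ENDPOINTS = {
    "aws":   "mydb.example.us-east-1.rds.amazonaws.com",
    "azure": "mydb.postgres.database.azure.com",
    "gcp":   "10.20.30.40",   # e.g. a Cloud SQL private IP
}

host = ENDPOINTS[os.environ.get("CLOUD_PROVIDER", "aws")]
conn = psycopg2.connect(host=host, dbname="saas", user="app",
                        password=os.environ["DB_PASSWORD"])

with conn, conn.cursor() as cur:
    cur.execute("SELECT version()")   # identical SQL on every provider
    print(cur.fetchone()[0])
```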
For SaaS vendors to succeed in a crowded marketplace, they need to start ahead of the game by imagining their services as the perfect microcosm of the cloud: elastic, innovative, resilient, and hopefully cost-effective.
If you have a suggestion for an eWEEK Data Points article, email cpreimesberger@eweek.com.