If there is one thing you hear a lot, it is that everyone is contending with way too many apps, databases, and tools, whether in their MarTech stack, sales tech stack, customer experience (CX) stack, HR, finance, or elsewhere. The average large company is knee-deep in cloud apps. Just how many? Try 1,935 within a medium-to-large organization by the end of last year, triple the number in use within the enterprise only five years before.
Cloud apps have tripled within enterprise firms in the last five years. Image courtesy McAfee
Apps and databases are everywhere, with one Kleiner Perkins/Netskope study finding nearly 150 within marketing and sales organizations alone, including social media, events, enrichment, sales force automation (SFA), configure-price-quote (CPQ), and collaboration...with a list that goes on and on. Connecting it all is foundational. Gartner notes that integration can amount to 18% of implementation effort, and “by 2023, at least 75% of organizations that did not factor integrability into their business application evaluation criteria will incur inefficient processes and higher overall costs.”
But wait. Integration and automation tools were meant to save us from all this! They would sync the apps, automate manual processes, map and transform the fields, connect REST and SOAP APIs, and load the data warehouses for analytics. There are certainly more than enough tools to choose from, depending on what you want to do. Here is an abridged list. Ready for acronym overload? Integration Platform as a Service (iPaaS), Extract, Transform, and Load (ETL), Enterprise Service Bus (ESB), Data Integration (DI), Business Process Management Systems (BPMS), Robotic Process Automation (RPA), Data Prep, API Management, and Low-Code Application Platforms (LCAP). And of course, if those tools did not work out, you could always resort to everyone’s favorite way to spend an afternoon: manually wrangling data in a spreadsheet.
But what if the very tools we expected to take care of everything became yet another sprawl, proliferating right alongside the apps they were meant to connect? Welcome to the end of the second decade of the 21st century, where we find ourselves in exactly that situation. Not only do we have an app, database, and tool explosion, but we now have a growing variety of tools and technologies meant to integrate them all, each aimed at different users, from technical to business folks.
How did we get here? And more importantly, is there any hope that we can simplify things to make them workable?
To get some answers, we need to go back in time. Early on, there was ETL. Back in the 1990s, ETL tools were all the rage for building and maintaining data warehouses. These tools were notoriously complicated, installed on-premises, and designed for highly technical IT folks. ETL tools represented hardcore data integration for the few, not the many, back when companies measured projects in years, not months. Back then, business users had to wait on IT for even small changes, such as adding a new data field. (Wait, that last part is still happening today!)
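For readers who have never touched one of these tools, the extract-transform-load pattern itself is simple. Here is a minimal sketch in Python, using an in-memory SQLite table as a stand-in warehouse; the source records, field names, and cleaning rules are all illustrative, not drawn from any real product.

```python
import sqlite3

# Hypothetical source rows, standing in for records extracted from an
# operational system (all names and values are made up for illustration).
SOURCE_ROWS = [
    {"Email": "ada@example.com ", "Plan": "pro", "MRR": "99"},
    {"Email": "grace@example.com", "Plan": "free", "MRR": "0"},
]

def extract():
    # Extract: pull raw records from the source system.
    return list(SOURCE_ROWS)

def transform(rows):
    # Transform: normalize field names, trim whitespace, cast types.
    return [
        {"email": r["Email"].strip().lower(),
         "plan": r["Plan"],
         "mrr_usd": int(r["MRR"])}
        for r in rows
    ]

def load(rows, conn):
    # Load: write the cleaned records into a warehouse table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers (email TEXT, plan TEXT, mrr_usd INTEGER)"
    )
    conn.executemany(
        "INSERT INTO customers VALUES (:email, :plan, :mrr_usd)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total_mrr = conn.execute("SELECT SUM(mrr_usd) FROM customers").fetchone()[0]
print(total_mrr)  # 99
```

The hard part of 1990s-era ETL was never this core loop; it was scaling it across hundreds of sources, schemas, and change requests, which is exactly where the IT bottleneck formed.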
At around the same time, we started to see application integration tools, which were called ESBs at the time. The purpose of ESBs was real-time integration, message orchestration, and routing. These highly technical tools were beyond the grasp of business folks. You’d use an ESB for a few “important” integrations deemed worthy of such a substantial investment.
All of this was integration and automation for a handful of expensive consultants to apply to lengthy IT projects that were few and far between.
With the cloud going mainstream, we saw the founding of new data and application integration companies, often by the same teams who worked on the on-premises originals. Except now, the idea was to put all these new tools in the cloud.
First came iPaaS tools, which were born back when new “X-as-a-Service” acronyms were all the rage. They were certainly a little easier for integration, but still, no one in marketing or sales was ever going to touch them—which was never the intention, anyway. iPaaS software was, and remains, extraordinarily complicated and requires a small army of developers to use properly. And back then, no one could foresee just how many apps and databases would be in play just a decade later. As a result, the designers of iPaaS never built any architecture to let anyone quickly connect to any API. Worse, they still relied on legacy business models that priced additional connectivity at a premium. There was no foresight into a future in which teams wouldn’t be content to choose just a handful of connectors to purchase a la carte. The foundation of iPaaS completely missed today’s need for business teams to connect their entire, ever-changing tech stack, and to do it now, not 6-12 months down the road when IT could eventually get to it.
At the same time, many of the original architects of ETL tools began building DI tools in the cloud. DI tools are great at integrating data to load data warehouses and are simpler to use. However, DI tools were not designed for real-time application integration.
Just a few years later, we began to see the rise of application programming interfaces (APIs). APIs are the layer through which different software applications communicate. They are nothing new, but over time, we began to see everyone looking to make applications API-enabled, to create API abstraction layers over existing APIs, and to connect APIs. The API management platform was born. However, to get the most out of such tools, you needed (and still need) a BS in Computer Science. Developing on API management platforms was slow, and they represented hardcore integration for the few (again, not for the many).
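The "abstraction layer over existing APIs" idea mentioned above can be sketched in a few lines. In this illustrative example, two hypothetical CRM clients (stubs standing in for real HTTP calls; neither represents an actual vendor API) return incompatible shapes, and a thin wrapper exposes one normalized contact schema to callers.

```python
# Two hypothetical CRM clients with incompatible response shapes. The class
# names, methods, and fields are invented for illustration only.
class AlphaCRM:
    def fetch_contact(self, contact_id):
        return {"EmailAddress": "ada@example.com", "FullName": "Ada Lovelace"}

class BetaCRM:
    def get_person(self, person_id):
        return {"email": "ada@example.com", "first": "Ada", "last": "Lovelace"}

class ContactAPI:
    """An abstraction layer exposing one normalized contact schema."""

    def __init__(self, backend):
        self.backend = backend

    def contact(self, contact_id):
        # Translate each backend's native shape into the shared schema.
        if isinstance(self.backend, AlphaCRM):
            raw = self.backend.fetch_contact(contact_id)
            return {"email": raw["EmailAddress"], "name": raw["FullName"]}
        raw = self.backend.get_person(contact_id)
        return {"email": raw["email"], "name": f"{raw['first']} {raw['last']}"}

# Callers see the same shape regardless of which system sits underneath.
print(ContactAPI(AlphaCRM()).contact(1) == ContactAPI(BetaCRM()).contact(1))  # True
```

API management platforms industrialize this translation work, which is also why they demand real engineering skill: every backend quirk has to be mapped by hand.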
Within the last decade, it was fast becoming clear that teams in marketing, sales, service, and operations needed integration tools they could use by themselves, without the complexity of previous generations. We saw the emergence of simple application integration tools like Zapier, which build point-to-point integrations. Such lightweight tools are useful for a quick-and-dirty integration, but the ease of use comes at a hefty price: every successful team that uses them eventually outgrows them as it requires more sophisticated business logic and scalability. Teams across organizations need more integration and automation power for strategic projects such as lead management, deal desk approvals, and multi-channel personalization.
The simplicity of point-to-point tools also meant that technical teams and IT wouldn’t embrace them, which mostly left business teams out in the cold in terms of comprehensive support or scalability. These tools seemed to offer a promising start to the decade, but still essentially led to a dead end.
Other tools emerged, like Robotic Process Automation (RPA), which enables teams to automate repetitive tasks. However, RPA mostly uses “bots” that perform screen scraping, run macros, and auto-repeat keystrokes. RPA was great for automating around technical debt, especially for older applications that lacked APIs. However, RPA tools are ill-suited for orchestrating a modern tech stack of apps that are rich in REST and SOAP APIs.
One of the more promising new developments in business technology has been Low-Code Application Platforms (LCAP). These tools provide a way for power users to visually develop their own automated processes and apps. But while suited for real-time application integration, LCAPs are not designed for the bulk data movement that data integration requires, the kind of thing that’s often essential to building cloud data warehouses and supporting modern analytics.
If you look inside a medium-sized or larger company, you will see where this journey led. Dozens of integration tools often lie buried in line-of-business teams and IT, each for different use cases and specific kinds of users, some new, others old.
There is no single platform that everyone can huddle around, which leads to balkanization of integrations and automated processes across different tools. Integration silos often lead to frustrated users as the backlog builds. As a result, it takes months to develop or change business processes, with process breakdowns depending on which tools, which skillsets, and which technical users your team relies on to bring about change. With mission-critical processes and tools divided into silos, creating an automated enterprise that can execute a coordinated vision of digital transformation becomes impossible.
The design of many tools also preceded some of the most significant changes in the cloud itself. Modern business applications, from marketing automation, sales, enterprise resource planning (ERP), and customer success have become event-driven, with standards like webhooks quickly becoming ubiquitous. As a result, there was finally the possibility of an automation architecture you could design from the ground up to be instantly responsive to business events, without polling, scheduling, or proprietary connections. Event-driven business applications opened up an entirely new category of automated processes, like dynamic lead management and orchestrating personalized customer outreach based on real-time changes in behavior.
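The difference between polling and the event-driven model described above comes down to who initiates: instead of a scheduler repeatedly asking "anything new?", the source pushes an event and subscribers react immediately. Here is a minimal sketch of that push model as an in-process event bus; the event names and handler logic are invented for illustration, and a real webhook would deliver the payload over HTTP rather than a function call.

```python
from collections import defaultdict

class EventBus:
    """A minimal dispatcher sketching the push model that webhooks enable."""

    def __init__(self):
        self.handlers = defaultdict(list)

    def on(self, event_type, handler):
        # Subscribe a handler, much like registering a webhook URL with an app.
        self.handlers[event_type].append(handler)

    def emit(self, event_type, payload):
        # The source pushes the event; every subscriber reacts at once,
        # with no polling loop or schedule in between.
        for handler in self.handlers[event_type]:
            handler(payload)

bus = EventBus()
alerts = []
# Hypothetical automation: route each new lead to sales the instant it arrives.
bus.on("lead.created", lambda p: alerts.append(f"route {p['email']} to sales"))
bus.emit("lead.created", {"email": "ada@example.com"})
print(alerts)  # ['route ada@example.com to sales']
```

The same shape explains the scaling challenge raised below: when the source controls the timing, a burst of events hits subscribers all at once instead of being smoothed out by a polling interval.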
However, fully taking advantage of such a crucial change posed a challenge, because becoming event-centric means data volumes can suddenly spike to 10x or 100x their usual levels in an instant. Sadly, first-generation SaaS architectures are ill-suited to cope with such enormous data spikes.
When we set out to build a General Automation Platform back in 2014, the goal was to start fresh, with a clean sheet of paper. We threw out the traditional preconception that a tool should be limited to specific use cases, such as real-time application integration or big data integration. We also rejected the idea that a tool business teams can use must trade away flexibility or scalability for ease of use. Our vision was to build something that works for everyone, from growth marketing and the deal desk, to revenue operations, business technologists, all the way up to the office of the CIO, for any use case: a General Automation Platform.
Going back to the drawing board and sketching out what it would look like, we settled on ten basic tenets for one platform that could finally fit everyone’s needs.
So, there you have it, the ultimate reason we created a GAP, not an iPaaS, ESB, ETL, or any of those other well-trodden terms that carry so much historical baggage. Instead, we built something new for the way we work today. It’s why enterprises like Arrow Electronics, FICO, Intercom, and hundreds of rapidly growing companies are using GAPs as their integration and automation Swiss Army knife.
Just how flexible is a GAP like the Tray Platform? Consider cloud computing leader DigitalOcean. The company runs a host of use cases for customer segmentation; real-time reporting; geolocation data enrichment; survey/NPS management; customer activity monitoring; customer-360 data aggregation and insights; sales stage change alerting; and personalized marketing. Each of these use cases is extremely valuable. Without a General Automation Platform, each of these use cases would have required the company to commission and manually connect multiple tools with time-consuming (and expensive) hand-coding.
We are talking about the nexus of process automation, data integration, scalability, usability, and flexibility. A General Automation Platform acts as a shared tool between line-of-business roles and technologists. David Dorman, Head of Growth Marketing at DigitalOcean, sums it up: “We can look across the end-to-end stack and connect any of our tools to gain total control. This is the next wave of what a stack will look like and the Tray Platform lets us fulfill that vision.”
To learn more about GAPs, why they’re different, and what they can do for your company, check out The Beginner’s Guide to General Automation Platforms. To see a GAP in action, sign up for a weekly group demo.