Matt Calkins never disappoints. He continues to be one of the few voices in the technology sector trying to rein in the excesses that other technology companies commit when it comes to Artificial Intelligence. This year, during Appian World 2026, held in Orlando (USA), he coined the term “East Coast AI” as a counterpoint to Silicon Valley AI, where anything seems to go.

Calkins compared the evolution of AI to the invention of the light bulb, which took 14 years to become truly impactful after overcoming standards battles and technical improvements. The head of the company cited a study carried out in collaboration with Harvard Business Review, which found that current AI is widely used for personal efficiency and cost cutting, but rarely for strategic growth or for applications that make critical business decisions.

To solve this, Calkins argued that the Silicon Valley model is inadequate because it does not prioritize responsibility and rigor. As he put it: “AI is probabilistic, so it requires a deterministic, process-based framework to be reliable.” In this model, the process acts as a “reliability machine” that provides the AI with the rails, data and integration it needs to work as part of a team. As an example of this success, he cited Appian DocCenter, a solution that achieves 99% accuracy in document processing, far exceeding the industry standard of 60%.
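The “reliability machine” idea can be sketched in code: a deterministic process wraps the probabilistic model, validates every output against fixed rules, and escalates anything doubtful to a person instead of guessing. This is a minimal illustration of the pattern, not Appian’s actual implementation; all names here are hypothetical.

```python
# Hypothetical sketch: deterministic rails around a probabilistic extractor.
# None of these names come from Appian's actual API.
import re
from dataclasses import dataclass

@dataclass
class Extraction:
    field: str
    value: str
    confidence: float  # model's self-reported confidence, 0.0-1.0

def mock_llm_extract(document: str) -> list[Extraction]:
    """Stand-in for a probabilistic model; real output would vary run to run."""
    return [
        Extraction("invoice_id", "INV-2026-001", 0.97),
        Extraction("total", "1,299.00", 0.62),
    ]

# Deterministic validation rules the process enforces regardless of the model.
RULES = {
    "invoice_id": re.compile(r"^INV-\d{4}-\d{3}$"),
    "total": re.compile(r"^\d{1,3}(,\d{3})*\.\d{2}$"),
}

def process(document: str, threshold: float = 0.9):
    """Validate every extraction; escalate anything that fails a rule or is low-confidence."""
    accepted, escalated = [], []
    for ex in mock_llm_extract(document):
        rule = RULES.get(ex.field)
        if rule and rule.match(ex.value) and ex.confidence >= threshold:
            accepted.append(ex)
        else:
            escalated.append(ex)  # routed to human review instead of guessed
    return accepted, escalated

accepted, escalated = process("...invoice text...")
print([e.field for e in accepted])   # fields that passed both gates
print([e.field for e in escalated])  # everything else goes to a person
```

The key design point is that the process, not the model, decides what counts as done: a correct-looking value with low confidence still goes to review.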

Towards “nines” of precision: specification-driven development

A key point of the presentation at Appian World 2026 was the introduction of the term “spec-driven development” as a counterpart to the popular “vibe coding”. Calkins explained that, although programming from vibes or natural language is acceptable for low-criticality applications, it is not sufficient for sectors such as pharmaceuticals, finance or NASA space missions, which require multiple “nines” of reliability (99.99%).

To reach these levels, Appian presented the new version of Composer, a tool that uses natural language to generate applications while allowing the user to audit and validate every rule, interface, and data join before the code is finally written. As he explained, “Composer has three use cases. It can be used to build new applications. This is now our main creation method. Second, it can be used to migrate legacy applications to the Appian platform. And third, it can be used to continually improve existing applications.”
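The essence of the spec-driven gate described above is that nothing gets generated until every declared rule has been reviewed and approved. A minimal sketch of that control flow, assuming a hypothetical `Spec` object (this is not Composer’s actual API):

```python
# Illustrative spec-driven flow: generation is blocked until every rule in the
# spec has been audited and approved. Hypothetical; not Composer's actual API.
from dataclasses import dataclass, field

@dataclass
class Spec:
    name: str
    rules: list[str]
    approved: set[str] = field(default_factory=set)

    def approve(self, rule: str) -> None:
        if rule not in self.rules:
            raise ValueError(f"unknown rule: {rule}")
        self.approved.add(rule)

    def ready(self) -> bool:
        """Generation is gated on every rule being audited and approved."""
        return self.approved == set(self.rules)

def generate(spec: Spec) -> str:
    if not spec.ready():
        pending = set(spec.rules) - spec.approved
        raise RuntimeError(f"spec not fully approved, pending: {sorted(pending)}")
    return f"// generated application: {spec.name}"

spec = Spec("claims-intake", rules=["validate-policy-number", "route-to-adjuster"])
spec.approve("validate-policy-number")
spec.approve("route-to-adjuster")
print(generate(spec))
```

The contrast with “vibe coding” is the hard gate: an unapproved rule is a build failure, not a warning.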

Matt Calkins, CEO of Appian, at one point during his speech

In other words, Appian has made natural language its new development standard, leaving the old approach behind. Where you used to build application objects one by one, you now describe what you need and Composer creates all the required objects and assembles them into a functional Appian application: power and reliability measured in nines, the data fabric, AI with responsible guardrails, and proven, reliable value.

Furthermore, Calkins made an urgent call for the modernization of legacy systems during his speech at Appian World 2026: “With the emergence of AIs like ‘Mythos’, capable of hacking old applications, the migration of data silos more than 20 years old to modern and secure platforms has become a national security priority, not just a matter of operational efficiency.”

Safe agents

Agents offer a clear example of how Appian’s approach to AI contrasts with Silicon Valley’s aggressiveness. Appian’s commitment is to make its agents smarter, safer and more effective, and to achieve this it gives them better structure, context and guardrails.

In this sense, Appian is improving interoperability across its AI ecosystem. By adopting powerful standards such as the Model Context Protocol (MCP), Appian agents will be able to securely interface with external enterprise systems. Third-party AI agents will have access to powerful Appian tools such as the data fabric, which uniquely provides unified read and write access to enterprise data.
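The Model Context Protocol mentioned here is built on JSON-RPC 2.0: a client invokes a server-exposed tool with a `tools/call` request. The sketch below shows that wire format; the tool name and arguments are hypothetical illustrations, not real Appian endpoints.

```python
# MCP is a JSON-RPC 2.0 protocol; a client invokes a server-side tool with a
# "tools/call" request. The tool name and arguments below are hypothetical.
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize a JSON-RPC 2.0 tools/call request as defined by the MCP spec."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical: an external agent querying a data-fabric-backed record set.
msg = mcp_tool_call(1, "query_records", {"record_type": "Customer", "limit": 10})
print(msg)
```

Because the protocol standardizes only the envelope, any MCP-capable agent can call any MCP-exposed tool, which is what makes the interoperability claim plausible.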

As seen at Appian World 2026, Appian is also advancing agent learning by letting users track agent performance and then apply an agent’s memory across processes to improve decision making. Users can build on this by giving the AI guidance on which goals to optimize, with the agent recommending improvements that can be safely applied.

Additionally, Appian’s data fabric, one of its flagship products, has been enhanced to provide a unified metadata model that gives agents clearer context about how information is structured and connected across systems.

Continuing its commitment to supporting industry-leading data platforms, Appian has announced at Appian World 2026 a partnership with Snowflake. This unites Appian as an AI orchestration layer with Snowflake’s AI Data Cloud, combining data aggregation, model training and process orchestration to enable immediate business value. MCP-enabled direct integration between the Appian data fabric and Snowflake equips agents with deep business context and allows them to interact directly with Snowflake Cortex AI to drive intelligent, data-backed decisions.