
Low Code to No Code with Ceres Nostradamus

Early in my career as a programmer, I had the pleasure of working for Texas Instruments, where I learned many incredible things about computer chip manufacturing as part of the team supporting several manufacturing plants worldwide. After years of supporting and developing cool applications (from voice recognition to in-memory databases), I moved to the software division, where TI envisioned and began developing game-changing software based on the information engineering methodology created by James Martin in the late 1980s. TI implemented Martin's methodology in a computer-aided software engineering (CASE) tool called the Information Engineering Facility (IEF). For me, IEF is nostalgic and brings back great memories of the time my former teammates and I spent implementing, selling, and competing against other CASE tools.

What does CASE have to do with the topic of this paper?

This is an excellent question. CASE represented the early stages of low-code application development. These tools could construct fully integrated solutions, starting with business modeling, moving to data modeling, and finishing with code generation. CASE tools required only basic knowledge of a high-level language combined with a degree of logical thinking; the commands you entered were translated into the chosen language at the compiler level.

You did not need to be proficient in languages like COBOL, C, or Java. Even better, you were not required to be familiar with all the platforms on which your business might want to deploy the new application. With these tools, it was possible to port applications to various platforms, such as IBM mainframes, HP-UX, Sun, and Tandem NonStop, among others. I developed and deployed numerous applications across these platforms for clients despite not being an expert in any one environment, and those applications behaved consistently regardless of the platform.

Eventually, CASE tools fell out of favor as enterprise resource planning (ERP) software such as SAP and Oracle took hold. Numerous CASE vendors went out of business or were acquired by larger companies, and this shift led me to explore new opportunities.

CASE tools were the first low-code products on the market, and many companies still use varieties of this class of software today.

Where did the market go from there?

The next evolution was the integrated application suite: ERP. We can all agree that ERP systems would not be considered low-code, but the theory behind them delivered the benefit of not having to hand-code an entire integrated platform. ERP soon became the reigning champion of application software; however, it was not the final chapter.

The world of technology is always evolving, with innovative tools and new programming languages emerging constantly. Some years ago, during my graduate work, I learned a great deal about artificial intelligence (AI) and neural networks while pursuing a Ph.D. in computer database design, my advisor and mentor's field of expertise. At that time, AI faced many challenges, mainly because we lacked the computing power to run complex models that could be applied in the business world; running such models back then would have required a supercomputer. Over time, the evolution of the microchip enabled the explosion of AI technology. As part of TI's semiconductor division, I took advantage of the growth in chip density and developed a planning application for the chip manufacturing plants, a fairly complex AI solution spanning many aspects of the manufacturing process, including lot size, wafer diameter, the types of machines available, and the duration required for each phase of production. It was a nice application that ran locally on a high-end workstation, but it demanded large amounts of processing power, memory, and time.

Today’s Artificial Intelligence

Today's artificial intelligence (AI) leverages many different types of data sets, both linear (structured) and non-linear (unstructured), to find simple and complex relationships across them. The biggest topic in technology worldwide, including in many boardrooms and governments, is generative AI. Several tools are available to train and use large language models (LLMs) in both low-code and no-code ways. ChatGPT, for example, offers its own APIs: you simply send a request and receive an answer, which lets people easily integrate this new technology into countless applications. Other services allow users to train LLMs through a simple API request or a dashboard, enabling you to set up your own LLM without writing a single line of code. That said, the explosive growth of these models raises numerous concerns, which is a topic for another article.
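
To make the low-code point concrete, here is a minimal sketch (in Python, using the requests library) of calling OpenAI's chat completions API. The model name and prompt are placeholders, and a real integration would add error handling and retries.

    import os
    import requests

    # Minimal sketch: send one prompt to OpenAI's chat completions API.
    # Assumes the OPENAI_API_KEY environment variable is set; the model
    # name and prompt are illustrative placeholders.
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-4o-mini",
            "messages": [
                {"role": "user",
                 "content": "Summarize the main categories of supply chain risk."},
            ],
        },
        timeout=30,
    )
    response.raise_for_status()
    print(response.json()["choices"][0]["message"]["content"])

A dozen lines of glue code is all the "programming" required, which is what makes these APIs such a natural fit for low-code integration.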

Having developed and worked with the CASE tools mentioned earlier, I became interested in exploring how low-code and no-code methods could address business challenges and enable users to deploy an AI platform quickly and effortlessly.

Establish the Vision & Create the Design

After joining Ceres, I spent time understanding our goals and what we needed to build to achieve them. I then developed a product roadmap whose target was an easy-to-use AI platform for supply chains. To achieve this objective, Nostradamus would be built to predict upstream disruptions and risks months ahead. To ensure it would work well across industries and geographies, it would need to incorporate not only internal factors but also important external ones (geopolitical, economic, weather, and many others). Furthermore, we wanted the platform to fully leverage the power of AI for a wide range of personas, recognizing that the supply chain touches many different departments and levels.

 

Figure 1: Illustrates the simplicity (yet robustness) of Nostradamus.

Build a Strong Framework with Robust Connectivity

We decided to build Nostradamus as a software-as-a-service platform requiring no code where feasible and low code where not. As we saw it, data integration was key: data transmission needed to be as easy and straightforward as possible. To accommodate non-technical users, we first enabled the import of historical transaction data sets in the most common formats, namely Excel and CSV files, so any user can simply drag and drop data into the platform. As with any AI platform, historical transactions are needed to train the AI engine; without this step, the model cannot perform the necessary analysis.
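
As an illustration of what happens behind such an import (the column names and the derived "late" label below are hypothetical, not Nostradamus internals), a dropped CSV or Excel file might be validated and prepared for training along these lines:

    import pandas as pd

    # Hypothetical sketch: ingest a historical-transaction file (CSV or Excel),
    # validate it, and derive a training label. Column names are illustrative.
    REQUIRED_COLUMNS = {"order_id", "supplier_id", "order_date",
                        "promised_date", "delivered_date"}

    def load_transactions(path: str) -> pd.DataFrame:
        df = pd.read_excel(path) if path.endswith((".xlsx", ".xls")) else pd.read_csv(path)
        missing = REQUIRED_COLUMNS - set(df.columns)
        if missing:
            raise ValueError(f"Missing required columns: {sorted(missing)}")
        for col in ("order_date", "promised_date", "delivered_date"):
            df[col] = pd.to_datetime(df[col])
        # The label the AI engine learns from: was the order delivered late?
        df["late"] = df["delivered_date"] > df["promised_date"]
        return df

    history = load_transactions("historical_orders.csv")
    print(f"{len(history)} transactions, {history['late'].mean():.0%} delivered late")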

The next step was to build an easy-to-use interface along with a means to load data using REST APIs. It was here that we truly achieved no-code/low-code by selecting Boomi, the leader in iPaaS. The ability to import and export through Boomi lets users move historical transactions (used to train the model) and in-flight transactions (used to generate predictions) from any system (e.g., SAP, Oracle) into Nostradamus for processing.
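
For the low-code path, an import can be as small as one REST call. The endpoint, token, and payload below are hypothetical placeholders used to illustrate the pattern, not the actual Ceres API; with Boomi, the same flow is configured visually with no code at all.

    import requests

    # Hypothetical sketch: push in-flight transactions to a prediction API.
    # Endpoint, token, and payload shape are illustrative placeholders.
    API_URL = "https://api.example.com/v1/transactions"
    API_TOKEN = "YOUR-TENANT-TOKEN"

    in_flight = [
        {"order_id": "PO-1001", "supplier_id": "S-42", "promised_date": "2024-09-30"},
        {"order_id": "PO-1002", "supplier_id": "S-17", "promised_date": "2024-10-05"},
    ]

    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"transactions": in_flight},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())  # e.g., per-order disruption predictions and risk levels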

 

Figure 2: Illustrates the many ways a user can transmit data.

Engineer Strong Performance and Add Advanced Features

Highly technical and analytical individuals, such as data scientists, naturally have high requirements and demand configurability; as expected, they want a robust ability to tune AI models to optimally support their businesses. This was very similar to my experience with CASE tools, where some technical experts were unhappy that they could not make desired changes to the generated code. (Along similar lines, have you ever worked with a lawyer who accepted a contract as written? The answer is no, right?)

Internally, we set our goal at a precision of at least 80% and a recall of at least 50%, favoring higher precision by default. These default settings for key AI-model metrics represent a balanced approach that mitigates false positives. Drawing on the combined strengths of our in-house data scientists and our Ph.D. advisors, we then extended the platform so that advanced technical users can modify these parameters, along with the validation data set (validation is key to confirming the AI model's performance), via both a dashboard (no-code) and our APIs (low-code).
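
For readers who want the definitions behind those targets: precision = TP / (TP + FP) and recall = TP / (TP + FN). The short sketch below (Python with scikit-learn, on synthetic scores) shows the lever an advanced user is really turning: raising the decision threshold trades recall for precision, which is exactly the precision-first default described above.

    import numpy as np
    from sklearn.metrics import precision_score, recall_score

    # Synthetic example: noisy disruption scores for 1,000 orders.
    rng = np.random.default_rng(0)
    y_true = rng.integers(0, 2, size=1000)          # 1 = disruption occurred
    y_score = np.clip(0.35 * y_true + rng.normal(0.35, 0.25, 1000), 0, 1)

    # A higher decision threshold flags fewer orders: precision rises, recall falls.
    for threshold in (0.4, 0.6, 0.8):
        y_pred = (y_score >= threshold).astype(int)
        print(f"threshold={threshold:.1f}  "
              f"precision={precision_score(y_true, y_pred):.2f}  "
              f"recall={recall_score(y_true, y_pred):.2f}")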

 

Figure 3: Advanced options available for sophisticated users.

Differentiate, Complement, and Augment

An important distinction must be made: Nostradamus is not a supplier-mapping platform, meaning it does not require extensive knowledge of the supplier network. Instead, it learns from historical and current transactional data, along with our 25,000+ external data sets (geopolitical risk, economic factors, commodity prices, news, satellite imaging, and many more), to predict multi-tier disruptions and assign risk levels. (In a future article, I will explain how Nostradamus can identify multi-tier suppliers without waiting for data-collection forms to be returned from suppliers.)

Figure 4: The factors that caused the predicted delays (explainability is another important capability, helping users understand the reasons and causes of delays).

By incorporating these capabilities, we removed the complexity of managing a supply chain AI platform and delivered a comprehensive no-code/low-code AI platform. This enables our clients to achieve faster ROI and enhance resilience in their supply chain networks without sending forms to suppliers or engaging in manual supply chain mapping, a task that can be time-consuming whether you have a handful of suppliers or thousands. Using our platform, onboarding any number of suppliers, whether 10 or 100,000, can be accomplished within two to four weeks.

Nostradamus delivers on the objectives we set:

  • Develop a robust AI platform
  • Simplify and accelerate deployment via a low-code/no-code platform
  • Enhance features to accommodate many different personas
  • Provide easy data integration (via Boomi Integration or Ceres APIs)
  • Combine external factors (outside the four walls), drawing on 25,000+ external data sets and growing, to deliver highly accurate results without manually intensive and time-consuming data mapping exercises

Nostradamus is truly unique: a no-code/low-code AI platform for the supply chain.

If you would like to reach out to me for an in-depth discussion of our deployment strategy, feel free to connect with me via LinkedIn.