Neural networks drive intelligent decision-making in many complex software systems, often known as smart systems. To streamline their development, model-driven engineering is pivotal for modelling and semi-automatically generating code for these systems. This assumes that neural networks can be modelled like any other component of the complex system. Unfortunately, due to the limited availability of modelling approaches tailored to neural networks, they are often developed separately from the other components of the system.
In model-driven software engineering, neural network modelling remains largely overlooked, with existing approaches typically addressing only certain types of NNs or focusing on a specific application domain. As a result, there is a need for comprehensive and expressive neural network metamodels to help streamline NN development. To fill this gap, in our paper titled Modelling Neural Network Models, co-authored by Nadia Daoudi, Iván Alfonso, and Jordi Cabot, we contribute an NN metamodel that encompasses the fundamental concepts of neural networks, together with a concrete syntax. This work will be presented at the International Conference on Research Challenges in Information Science (RCIS) in May 2025.
Additionally, we developed code generators for two popular deep learning frameworks, PyTorch and TensorFlow, and showed how our NN modelling approach can be integrated with the modelling of the global system. Our NN metamodel and code generators are developed as part of the BESSER low-code platform. BESSER aims to facilitate software development by offering the possibility to model and generate code for different system components, such as REST API backends and Flutter applications, in a unified and centralized framework.
BESSER for smart component development
Components in BESSER are modeled using B-UML (BESSER's Universal Modeling Language), which is the foundational language for specifying domain models. Submodules can be added to B-UML to model new system components, enabling the platform to evolve and accommodate additional functionalities as needed. We extend the B-UML language with an NN metamodel and a concrete syntax for specifying neural network (NN) concepts. We also develop our two NN code generators within BESSER. We provide in Figure 1 an overview of our approach.

Figure 1: Overview of our model-based NN approach
In the first step, the user provides an NN textual definition as input. This textual definition, which conforms to a grammar developed with ANTLR, encapsulates information such as the layers, the hyper-parameters, and the training and test datasets. Then, the textual definition is parsed into an NN model that is an instance of our NN metamodel. The metamodel is central to the architecture, as it abstracts the essential concepts of neural networks. Finally, the NN model is given as input to the code generators to produce neural network code for the PyTorch and TensorFlow frameworks. In the following, we delve into the specifics of the approach.
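To give a feel for the parsing step, here is a highly simplified sketch. The real implementation uses an ANTLR-generated parser over the full grammar; the layer syntax, class names, and helper below are purely illustrative.

```python
import re
from dataclasses import dataclass

@dataclass
class LayerSpec:
    """Toy stand-in for a layer instance of the NN metamodel."""
    name: str
    layer_type: str
    params: dict

def parse_layer(line: str) -> LayerSpec:
    # Parses a single hypothetical layer definition such as:
    #   l1 = Conv2D(filters=32, kernel=3)
    m = re.match(r"(\w+)\s*=\s*(\w+)\((.*)\)", line.strip())
    if m is None:
        raise ValueError(f"unrecognized layer definition: {line!r}")
    name, layer_type, arg_str = m.groups()
    params = {}
    for arg in filter(None, (a.strip() for a in arg_str.split(","))):
        key, value = arg.split("=")
        params[key.strip()] = int(value) if value.strip().isdigit() else value.strip()
    return LayerSpec(name, layer_type, params)

layer = parse_layer("l1 = Conv2D(filters=32, kernel=3)")
print(layer.layer_type, layer.params)  # Conv2D {'filters': 32, 'kernel': 3}
```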
Neural Network Metamodel
Inspired by PyTorch and TensorFlow, we designed our metamodel to encapsulate important concepts of neural networks, including layers, parameters, datasets, training, and performance evaluation. We note that our metamodel is not designed exclusively for PyTorch and TensorFlow; instead, it incorporates common concepts that are central to neural network development. This makes our metamodel versatile and adaptable across different contexts within NN development. We present the class diagram of our NN metamodel in Figure 2.

Figure 2: The NN metamodel
The main concepts in the NN metamodel are represented using metaclasses and class associations. At its core, the metamodel contains the NeuralNetworkModel metaclass, which comprises the fundamental properties and behaviors of a neural network model. A NeuralNetworkModel can contain three types of components, referred to as modules: Layers, TensorOps, and (sub-)NeuralNetworkModels. Specifically, a NeuralNetworkModel can incorporate another NeuralNetworkModel as a component (the contains relationship in the metamodel). This is common in neural networks, where an NN model can be structured to include other NN models as part of its architecture.
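The containment hierarchy can be sketched as plain Python classes. The class and attribute names below are illustrative simplifications of the metamodel in Figure 2, not BESSER's actual API.

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Layer:
    name: str

@dataclass
class TensorOp:
    name: str

@dataclass
class NeuralNetworkModel:
    # A module is a Layer, a TensorOp, or a nested NeuralNetworkModel,
    # mirroring the 'contains' relationship in the metamodel.
    name: str
    modules: List[Union[Layer, TensorOp, "NeuralNetworkModel"]] = field(default_factory=list)

# A feature extractor reused as a sub-model of a larger network
encoder = NeuralNetworkModel("encoder", [Layer("conv1"), Layer("pool1")])
classifier = NeuralNetworkModel("classifier", [encoder, Layer("fc1")])
print([type(m).__name__ for m in classifier.modules])  # ['NeuralNetworkModel', 'Layer']
```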
The metamodel defines various types of layers such as convolutional, recurrent, and linear layers, with their parameters. TensorOp is used to represent an operation or function applied to tensors. Training and test datasets are represented using the Dataset metaclass. As for the training and validation processes, their parameters are encapsulated in the Configuration metaclass.
To instantiate the concepts of our metamodel, we created a textual notation supported by a grammar developed with ANTLR. We provide in Listing 1 an excerpt of the textual notation of an NN model proposed in a TensorFlow tutorial. The complete grammar is available in the BESSER code repository.

Listing 1: NN textual model example
The model definition begins by specifying the NN’s name (my_model). Next, three layers are defined (lines 2-19), with l1 and l3 being 2D Convolutional layers, and l2 as a Pooling layer. Then, the modules definition (lines 21-22) specifies the order of the layers. Finally, the configuration is defined (lines 23-29), such as the “adam” optimiser. The full textual model can be accessed in the project repository.
Code generation
BESSER offers a code generation interface that can be implemented to add new code generators. We leveraged the Jinja template engine and implemented the BESSER interface to develop the NN code generators. We relied on model-to-text transformations to map neural network concepts to their equivalent code in the target framework. The PyTorch and TensorFlow code generated for the NN model defined in Listing 1 is available in our code repository.
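As a rough illustration of such a model-to-text transformation, the toy Jinja template below maps each layer of an in-memory NN model to its PyTorch constructor call. The template and the layer dictionaries are invented for this example; they are not BESSER's actual templates or data structures.

```python
from jinja2 import Template  # BESSER's generators use Jinja; this template is a toy stand-in

# Minimal model-to-text transformation: each layer in the NN model
# is rendered as the corresponding PyTorch layer constructor.
PYTORCH_TEMPLATE = Template("""\
import torch.nn as nn

class {{ model_name }}(nn.Module):
    def __init__(self):
        super().__init__()
{%- for layer in layers %}
        self.{{ layer.name }} = nn.{{ layer.torch_class }}({{ layer.args }})
{%- endfor %}
""")

layers = [
    {"name": "l1", "torch_class": "Conv2d", "args": "3, 32, kernel_size=3"},
    {"name": "l2", "torch_class": "MaxPool2d", "args": "2"},
]
code = PYTORCH_TEMPLATE.render(model_name="MyModel", layers=layers)
print(code)
```

Generating TensorFlow code then amounts to swapping in a second template that targets the Keras API, while the NN model itself stays unchanged.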
Integration with the Overall System
To facilitate seamless integration of neural networks within the global system architecture, we show how our approach can be combined with a more general modelling language, such as UML. In practice, NNs can be considered behavior implementations, since invoking a neural network can be tied to the declaration of a behavior for a specific component in the system. For example, a method executed in one of the states of a state machine could implement a neural network to perform prediction tasks required for decision-making.

Figure 3: Integration of our NN modelling approach with the UML language at metamodel level. This class diagram shows only the relevant classes to illustrate the integration.
Figure 3 shows the integration of our NeuralNetworkModel class with concepts from the UML modelling language. Integration is done at the metamodel level by combining the NN metamodel and the UML metamodel. Specifically, UML enables the specification of various model types, such as state machine models and structural models. Some concepts defined in these models can also declare behaviors (the BehaviorDeclaration class in Figure 3) to describe dynamic aspects of the system. For instance, classes in a structural model or methods in a state machine can declare one or more behaviors.
On the other hand, the BehaviorImplementation concept provides the concrete implementation of behaviors defined by BehaviorDeclaration. This implementation can represent specific actions that a class performs under certain conditions or in response to particular events. Both NeuralNetworkModel and StateMachineModel can be modeled as concrete implementations of a behavior.
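The declaration/implementation split can be sketched as follows. The class names loosely follow Figure 3, but the attributes and the example objects are invented for illustration and do not reflect BESSER's actual API.

```python
from dataclasses import dataclass
from typing import Optional

class BehaviorImplementation:
    """Base for concrete implementations of a declared behavior (Figure 3)."""

@dataclass
class NeuralNetworkModel(BehaviorImplementation):
    # An NN model is one possible concrete implementation of a behavior
    name: str

@dataclass
class BehaviorDeclaration:
    # Declared on a UML concept (e.g. a class or a state machine method)
    name: str
    implementation: Optional[BehaviorImplementation] = None

@dataclass
class Method:
    # e.g. a method executed in one of the states of a state machine
    name: str
    behavior: Optional[BehaviorDeclaration] = None

# A state-machine method whose declared behavior is implemented by an NN
predict = BehaviorDeclaration("predict_defect")
predict.implementation = NeuralNetworkModel("cnn_classifier")
inspect_method = Method("on_inspect", behavior=predict)
print(type(inspect_method.behavior.implementation).__name__)  # NeuralNetworkModel
```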
With this extension, any system modeled with UML can use an NN defined with our approach as an associated behavior for its subsystems.
The NN metamodel, textual notation, and PyTorch and TensorFlow code generators are all available in BESSER's repository. This work is a first step in BESSER's smart component development. New features and functionalities are planned for the future.