Commit 6ab984d6 authored by Thanassis Tsiodras

Merge branch 'feature_buster' into CoRA-ZynQ

parents 0bfdc43a cb38fff2
Pipeline #2836 failed with stage in 34 seconds
# Change Log
## 2.1.2
- Use Unsigned_32 in Dataview.aadl to support large messages
## 2.1.1
- Moved to statically imported (and verifiable by mypy) A and B mappers
## 2.1.0
- Major update of the PySide B mapper (API updates)
- Bug fix in the Python A mapper
## 2.0.0
- Moved to Python3
- Added type annotations, checked via mypy
- Added git commit hooks to check via flake8 AND pylint
- Configuration files used to customize the checks for the project's
naming conventions.
- Many bugs identified and fixed.
## 1.2.3
- ctypes backend: emit all ENUMERATED values in DV.py
## 1.2.0
- Not using SWIG anymore for the Python mappers
## 1.1.2
- pyside_b_mapper: added combo box for the asn1 value editor
## 1.1.1
- support empty SEQUENCEs (MyType ::= SEQUENCE {}) - except for Simulink
PY_SRC:=$(wildcard dmt/asn2dataModel.py dmt/aadl2glueC.py dmt/smp2asn.py dmt/*mappers/[a-zA-Z]*py dmt/commonPy/[a-zA-Z]*py)
PY_SRC:=$(filter-out dmt/B_mappers/antlr.main.py dmt/A_mappers/Stubs.py, ${PY_SRC})
PY_SRC:=$(filter-out dmt/B_mappers/antlr.main.py dmt/A_mappers/Stubs.py dmt/B_mappers/micropython_async_B_mapper.py dmt/commonPy/commonSMP2.py, ${PY_SRC})
# Python3.5 includes an older version of typing, which by default has priority over
# the one installed in $HOME/.local via setup.py.
......@@ -25,11 +25,11 @@ flake8:
pylint:
@echo Performing static analysis via pylint...
@pylint --disable=I --rcfile=pylint.cfg ${PY_SRC} | grep -v '^$$' | sed -n '/^Report/q;p'
@pylint --disable=I --rcfile=pylint.cfg ${PY_SRC}
mypy:
@echo Performing type analysis via mypy...
@mypy --disallow-untyped-defs --check-untyped-defs --ignore-missing-imports ${PY_SRC} || exit 1
@mypy ${PY_SRC} || exit 1
coverage:
@echo Performing coverage checks...
......
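
For readers unfamiliar with the mypy flags in the target above: `--disallow-untyped-defs` rejects any function definition without annotations, and `--check-untyped-defs` additionally type-checks the bodies of unannotated functions instead of skipping them. A minimal, hypothetical example (not from the DMT sources; the function names are made up):

```python
# Illustrative only. Under mypy --disallow-untyped-defs, the first definition
# is reported as "error: Function is missing a type annotation"; the second,
# fully annotated one passes.

def scale(x, factor):
    return x * factor

def scale_typed(x: float, factor: float) -> float:
    return x * factor
```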
[![Build and Test Status of Data Modelling Tools on Circle CI](https://circleci.com/gh/ttsiodras/DataModellingTools.svg?&style=shield&circle-token=9df10d36b6b4ccd923415a5890155b7bf54b95c5)](https://circleci.com/gh/ttsiodras/DataModellingTools/tree/master)
[![Build and Test Status of Data Modelling Tools on Gitlab CI](https://gitrepos.estec.esa.int/taste/dmt/badges/master/pipeline.svg)](https://gitrepos.estec.esa.int/taste/dmt/-/commits/master)
TASTE Data Modelling Tools
==========================
These are the tools used by the European Space Agency's [TASTE toolchain](https://taste.tools/)
to automate handling of the Data Modelling. They include more than two
dozen codegenerators that automatically create the 'glue'; the run-time translation
dozen codegenerators that automatically create the 'glue'; that is, the run-time translation
bridges that allow code generated by modelling tools (Simulink, SCADE, OpenGeode, etc)
to "speak" to one another, via ASN.1 marshalling.
......@@ -21,6 +21,11 @@ Installation
For using the tools, this should suffice:
$ sudo apt-get install libxslt1-dev libxml2-dev zlib1g-dev python3-pip
$ ./configure
$ # Optionally, configure a Python virtual environment (via venv)
$ # to avoid "polluting" your system-level Python with dependencies
$ # you may not want.
$ # But whether with an activated venv or not, you end up with:
$ pip3 install --user --upgrade .
For developing the tools, the packaged Makefile allows for easy static-analysis
......@@ -65,7 +70,7 @@ What is packaged:
Reads the AADL specification of the system, and then generates the runtime
bridge-code that will map the message data structures from those generated
by [ASN1SCC](https://github.com/ttsiodras/asn1scc) to/from those generated
by the modeling tool (that is used to functionally model the subsystem -
by the modeling tool (that is used to functionally model the subsystem;
e.g. SCADE, ObjectGeode, Matlab/Simulink, C, Ada, etc).
Contact
......@@ -79,7 +84,7 @@ contact me at:
System, Software and Technology Department
European Space Agency
ESTEC
ESTEC / TEC-SWT
Keplerlaan 1, PO Box 299
NL-2200 AG Noordwijk, The Netherlands
Athanasios.Tsiodras@esa.int | www.esa.int
......
......@@ -608,6 +608,7 @@ infodir
docdir
oldincludedir
includedir
runstatedir
localstatedir
sharedstatedir
sysconfdir
......@@ -672,6 +673,7 @@ datadir='${datarootdir}'
sysconfdir='${prefix}/etc'
sharedstatedir='${prefix}/com'
localstatedir='${prefix}/var'
runstatedir='${localstatedir}/run'
includedir='${prefix}/include'
oldincludedir='/usr/include'
docdir='${datarootdir}/doc/${PACKAGE_TARNAME}'
......@@ -924,6 +926,15 @@ do
| -silent | --silent | --silen | --sile | --sil)
silent=yes ;;
-runstatedir | --runstatedir | --runstatedi | --runstated \
| --runstate | --runstat | --runsta | --runst | --runs \
| --run | --ru | --r)
ac_prev=runstatedir ;;
-runstatedir=* | --runstatedir=* | --runstatedi=* | --runstated=* \
| --runstate=* | --runstat=* | --runsta=* | --runst=* | --runs=* \
| --run=* | --ru=* | --r=*)
runstatedir=$ac_optarg ;;
-sbindir | --sbindir | --sbindi | --sbind | --sbin | --sbi | --sb)
ac_prev=sbindir ;;
-sbindir=* | --sbindir=* | --sbindi=* | --sbind=* | --sbin=* \
......@@ -1061,7 +1072,7 @@ fi
for ac_var in exec_prefix prefix bindir sbindir libexecdir datarootdir \
datadir sysconfdir sharedstatedir localstatedir includedir \
oldincludedir docdir infodir htmldir dvidir pdfdir psdir \
libdir localedir mandir
libdir localedir mandir runstatedir
do
eval ac_val=\$$ac_var
# Remove trailing slashes.
......@@ -1214,6 +1225,7 @@ Fine tuning of the installation directories:
--sysconfdir=DIR read-only single-machine data [PREFIX/etc]
--sharedstatedir=DIR modifiable architecture-independent data [PREFIX/com]
--localstatedir=DIR modifiable single-machine data [PREFIX/var]
--runstatedir=DIR modifiable per-process data [LOCALSTATEDIR/run]
--libdir=DIR object code libraries [EPREFIX/lib]
--includedir=DIR C header files [PREFIX/include]
--oldincludedir=DIR C header files for non-gcc [/usr/include]
......
......@@ -8,7 +8,7 @@
import re
import os
from typing import Union, List # NOQA pylint: disable=unused-import
from typing import Union, List, IO, Any # NOQA pylint: disable=unused-import
from ..commonPy import asnParser
from ..commonPy.utility import panic, inform
......@@ -21,11 +21,11 @@ from ..commonPy.asnParser import AST_Lookup, AST_Leaftypes
from ..commonPy.cleanupNodes import SetOfBadTypenames
# The Python file written to
g_outputFile = None
g_outputFile: IO[Any]
# The SETers and GETers files
g_outputGetSetH = None
g_outputGetSetC = None
g_outputGetSetH: IO[Any]
g_outputGetSetC: IO[Any]
g_bHasStartupRunOnce = False
......@@ -547,7 +547,7 @@ def DumpTypeDumper(
lines.append(codeIndent + "state = self.GetState()")
lines.append(codeIndent + "length = %s.GetLength()" % variableName)
lines.append(codeIndent + "self.Reset(state)")
lines.append(codeIndent + "map(partial(emitElem, %s), range(length))" % variableName)
lines.append(codeIndent + "list(map(partial(emitElem, %s), range(length)))" % variableName)
lines.append(codeIndent + 'self.Reset(state)')
lines.append(codeIndent + 'lines.append("}")')
......
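
The `list(map(...))` change above matters because Python 3's `map` is lazy: when `map` is used purely for its side effects, nothing runs until the resulting iterator is consumed. A standalone sketch of the difference (the names below are illustrative, not taken from the generated dumper code):

```python
# Illustrative only: Python 3 `map` is lazy, so side effects do not happen
# until the map object is iterated. Wrapping it in list() forces evaluation,
# which is what the emitted dumper code now does.

collected = []

def emit(i: int) -> None:
    collected.append(i)

map(emit, range(3))          # lazy: nothing is appended
assert collected == []

list(map(emit, range(3)))    # forces iteration: the side effects run
assert collected == [0, 1, 2]
```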
......@@ -30,7 +30,7 @@ import sys
import re
from distutils import spawn
from typing import List, Union, Set # NOQA
from typing import List, Union, Set, IO, Any # NOQA
from ..commonPy import asnParser
from ..commonPy.utility import panic, inform
......@@ -44,7 +44,7 @@ from ..commonPy.asnAST import AsnNode # NOQA pylint: disable=unused-import
from ..commonPy.asnParser import AST_Leaftypes, AST_Lookup
# The file written to
g_outputFile = None
g_outputFile: IO[Any]
# A map of the ASN.1 types defined so far
g_definedTypes = set() # type: Set[asnParser.Typename]
......
......@@ -23,7 +23,7 @@
#
import re
from typing import Union, Set, List # NOQA pylint: disable=unused-import
from typing import Union, Set, List, IO, Any # NOQA pylint: disable=unused-import
from ..commonPy.utility import panic, inform
from ..commonPy import asnParser
......@@ -37,7 +37,7 @@ from ..commonPy.cleanupNodes import SetOfBadTypenames
from ..commonPy.asnParser import AST_Leaftypes, AST_Lookup
# The file written to
g_outputFile = None
g_outputFile: IO[Any]
# A map of the ASN.1 types defined so far
g_definedTypes = set() # type: Set[asnParser.Typename]
......
......@@ -28,7 +28,7 @@ import os
import random
from xml.dom.minidom import Document, Node # type: ignore # NOQA pylint: disable=unused-import
from typing import Union, Set, Dict # NOQA pylint: disable=unused-import
from typing import Optional, Union, Set, Dict, IO, Any # NOQA pylint: disable=unused-import
from ..commonPy.utility import inform, panic
from ..commonPy.asnAST import (
......@@ -47,7 +47,7 @@ g_lookup = {
}
# The file written to
g_outputFile = None
g_outputFile: IO[Any]
# The assigned OIDs
g_oid = {} # type: Dict[str, str]
......@@ -62,7 +62,7 @@ g_currOid = 0x1f00
g_declaredTypes = set() # type: Set[str]
# The DOM elements
g_doc = None
g_doc: Document
g_Declarations = None
......@@ -142,7 +142,7 @@ def RenderElements(controlString: str) -> None:
for elem in controlString.split(","):
if '`' in elem:
element = elem.split("`")[0]
under = elem.split("`")[1]
under = elem.split("`")[1] # type: Optional[str]
else:
element = elem
under = None
......
......@@ -20,7 +20,7 @@
#
import re
from typing import Union, Set, List # NOQA pylint: disable=unused-import
from typing import Union, Set, List, IO, Any # NOQA pylint: disable=unused-import
from ..commonPy.utility import panic, inform
from ..commonPy import asnParser
......@@ -35,7 +35,7 @@ from ..commonPy.cleanupNodes import SetOfBadTypenames
# The file written to
g_outputFile = None
g_outputFile: IO[Any]
# A map of the ASN.1 types defined so far
g_definedTypes = set() # type: Set[str]
......
......@@ -34,11 +34,11 @@ from ..commonPy.cleanupNodes import SetOfBadTypenames
from ..commonPy.asnAST import AsnBasicNode, AsnSequenceOrSet, AsnSequenceOrSetOf, AsnEnumerated
from ..commonPy.asnParser import AST_Leaftypes
g_catalogueXML = None # type: IO[Any]
g_catalogueXML: IO[Any]
g_innerTypes = set() # type: Set[str]
g_uniqueStringOfASN1files = ""
g_outputDir = "."
g_asnFiles = None
g_asnFiles: List[str]
def Version() -> None:
......
......@@ -35,11 +35,11 @@ from ..commonPy.utility import panic, warn
from ..commonPy.cleanupNodes import SetOfBadTypenames
from ..commonPy.asnParser import AST_Leaftypes
g_sqlOutput = None # type: IO[Any]
g_sqlOutput: IO[Any]
g_innerTypes = set() # type: Set[str]
g_uniqueStringOfASN1files = ""
g_outputDir = "."
g_asnFiles = None
g_asnFiles: List[str]
# ====== Dummy stubs =====
......
......@@ -34,11 +34,11 @@ from ..commonPy.asnParser import g_names, g_leafTypeDict, CleanNameForAST, AST_L
from ..commonPy.utility import panic, warn
from ..commonPy.cleanupNodes import SetOfBadTypenames
g_sqlalchemyOutput = None # type: IO[Any]
g_sqlalchemyOutput: IO[Any]
g_innerTypes = {} # type: Dict[str, int]
g_uniqueStringOfASN1files = ""
g_outputDir = "."
g_asnFiles = None # type: Union[str, List[str]]
g_asnFiles: Union[str, List[str]]
# ====== Dummy stubs =====
......
......@@ -45,7 +45,7 @@ from ..commonPy.asnAST import AsnNode
from ..commonPy.asnParser import AST_Lookup, AST_Leaftypes
isAsynchronous = True
cBackend = None
cBackend: c_B_mapper.C_GlueGenerator
def OnStartup(unused_modelingLanguage: str, asnFile: str, outputDir: str, maybeFVname: str, useOSS: bool) -> None:
......
......@@ -73,11 +73,9 @@ class ASynchronousToolGlueGenerator:
def __init__(self) -> None:
# The files written to
self.C_HeaderFile = None # type: IO[Any]
self.C_SourceFile = None # type: IO[Any]
self.asn_name = ""
self.supportedEncodings = ['native', 'uper', 'acn']
self.useOSS = None # type: bool
self.useOSS = False
self.typesToWorkOn = {} # type: Dict[str, Tuple[AsnNode, AST_Leaftypes, AST_Lookup]]
def OnStartup(self, modelingLanguage: str, asnFile: str, outputDir: str, maybeFVname: str, useOSS: bool) -> None:
......@@ -89,10 +87,10 @@ class ASynchronousToolGlueGenerator:
outputCsourceFilename = self.CleanNameAsToolWants(prefix) + "_ASN1_Types.c"
inform(str(self.__class__) + ": Creating file '%s'...", outputCheaderFilename)
self.C_HeaderFile = open(outputDir + outputCheaderFilename, 'w')
self.C_HeaderFile = open(outputDir + outputCheaderFilename, 'w') # pylint: disable=attribute-defined-outside-init
inform(str(self.__class__) + ": Creating file '%s'...", outputCsourceFilename)
self.C_SourceFile = open(outputDir + outputCsourceFilename, 'w')
self.C_SourceFile = open(outputDir + outputCsourceFilename, 'w') # pylint: disable=attribute-defined-outside-init
self.asn_name = os.path.basename(os.path.splitext(asnFile)[0])
......@@ -100,11 +98,14 @@ class ASynchronousToolGlueGenerator:
ID = re.sub(r'[^A-Za-z0-9_]', '_', ID).upper()
self.C_HeaderFile.write("#ifndef __%s_H__\n" % ID)
self.C_HeaderFile.write("#define __%s_H__\n\n" % ID)
self.C_HeaderFile.write("#ifdef __unix__\n")
self.C_HeaderFile.write("#if defined( __unix__ ) || defined( __MSP430__ )\n")
self.C_HeaderFile.write("#include <stdlib.h> /* for size_t */\n")
self.C_HeaderFile.write("#else\n")
self.C_HeaderFile.write("typedef unsigned size_t;\n")
self.C_HeaderFile.write("#endif\n\n")
self.C_HeaderFile.write("#ifndef STATIC\n")
self.C_HeaderFile.write("#define STATIC\n")
self.C_HeaderFile.write("#endif\n\n")
self.C_HeaderFile.write("\n")
self.C_SourceFile.write("#ifdef __unix__\n")
self.C_SourceFile.write("#include <stdio.h>\n")
......@@ -115,8 +116,6 @@ class ASynchronousToolGlueGenerator:
self.HeadersOnStartup(asnFile, outputDir, maybeFVname)
self.typesToWorkOn = {} # type: Dict[str, Tuple[AsnNode, AST_Leaftypes, AST_Lookup]]
def Common(self, nodeTypename: str, node: AsnNode, leafTypeDict: AST_Leaftypes, names: AST_Lookup) -> None:
# Async backends are different: they work on ASN.1 types, not SP params.
......
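
A minimal sketch of the pattern the hunk above adopts: the `None` placeholders for the file handles and for `useOSS` are dropped from `__init__` (a concrete `False` replaces `None` for `useOSS`), and each handle is first bound where its file is actually opened, with pylint's attribute-defined-outside-init warning silenced at that point. The class and file names below are made up for illustration, not the real generator:

```python
# Illustrative sketch only; the real class is ASynchronousToolGlueGenerator.
class ToolGlueGenerator:
    def __init__(self) -> None:
        self.useOSS = False   # concrete default instead of a None placeholder
        self.asn_name = ""

    def OnStartup(self, outputDir: str) -> None:
        # The handle is first bound where the file is actually opened;
        # pylint's W0201 (attribute-defined-outside-init) is silenced here.
        self.C_HeaderFile = open(outputDir + "Tool_ASN1_Types.h", "w")  # pylint: disable=attribute-defined-outside-init
```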
......@@ -48,7 +48,6 @@ from ..commonPy.recursiveMapper import RecursiveMapper
from .asynchronousTool import ASynchronousToolGlueGenerator
isAsynchronous = True
cBackend = None
# noinspection PyListCreation
......@@ -429,6 +428,9 @@ class C_GlueGenerator(ASynchronousToolGlueGenerator):
self.C_SourceFile.write("#endif\n\n")
cBackend: C_GlueGenerator
def OnStartup(modelingLanguage: str, asnFile: str, outputDir: str, maybeFVname: str, useOSS: bool) -> None:
global cBackend
cBackend = C_GlueGenerator()
......
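
The `cBackend: C_GlueGenerator` line illustrates the module-global pattern used across these mappers: the global is annotated but left unbound at import time, then assigned inside `OnStartup` via `global`. Unlike the old `cBackend = None`, mypy then treats every later read as the concrete class rather than an `Optional`, so call sites need no `None` checks. A self-contained sketch with a hypothetical `Backend` class:

```python
# Illustrative sketch of the annotate-then-assign module-global pattern.
class Backend:
    def emit(self) -> str:
        return "generated glue"

# Declared (annotated) but not yet bound; mypy sees later reads as Backend,
# not Optional[Backend].
backend: Backend

def OnStartup() -> None:
    global backend
    backend = Backend()

def OnShutdown() -> str:
    return backend.emit()   # NameError at runtime if OnStartup was never called

OnStartup()
print(OnShutdown())
```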
......@@ -35,20 +35,20 @@ from ..commonPy import asnParser
from ..commonPy.asnParser import AST_Lookup, AST_Leaftypes
from ..commonPy.aadlAST import ApLevelContainer, Param
g_HeaderFile = None # type: IO[Any]
g_SourceFile = None # type: IO[Any]
g_GnuplotFile = None # type: IO[Any]
g_MyEvents = None
g_MyCreation = None
g_MyClickPrototypes = None
g_MyAction = None
g_MyControls = None
g_MyLoad = None
g_MySave = None
g_MyThreadsInc = None
g_MyThreadsH = None
g_MyTelemetryActions = None
g_HeaderFile: IO[Any]
g_SourceFile: IO[Any]
g_GnuplotFile: IO[Any]
g_MyEvents: IO[Any]
g_MyCreation: IO[Any]
g_MyClickPrototypes: IO[Any]
g_MyAction: IO[Any]
g_MyControls: IO[Any]
g_MyLoad: IO[Any]
g_MySave: IO[Any]
g_MyThreadsInc: IO[Any]
g_MyThreadsH: IO[Any]
g_MyTelemetryActions: IO[Any]
g_bStarted = False
g_IDs = 20000
......
......@@ -50,7 +50,7 @@ from .c_B_mapper import C_GlueGenerator
backend_C = None
backend_uPy = None
backends = None
backends = [] # type: List[ASynchronousToolGlueGenerator]
# TODO replace most of the typedefs with an include of py/obj.h
h_header_str = """
......@@ -213,7 +213,7 @@ class MapUPyObjData(RecursiveMapperGeneric[str, str]):
for child in node._members:
contained = self.Map(srcVar, destVar, child[1], leafTypeDict, names)
if contained:
lines.extend(' ' + l for l in contained[:-1])
lines.extend(' ' + ll for ll in contained[:-1])
lines.append(' %s data_%s;' % (contained[-1], self.CleanName(child[0])))
lines.append('}')
return lines
......@@ -231,7 +231,7 @@ class MapUPyObjData(RecursiveMapperGeneric[str, str]):
for child in node._members:
contained = self.Map(srcVar, destVar, child[1], leafTypeDict, names)
if contained:
lines.extend(' ' + l for l in contained[:-1])
lines.extend(' ' + ll for ll in contained[:-1])
lines.append(' %s %s;' % (contained[-1], self.CleanName(child[0])))
lines.append(' } data;')
lines.append('}')
......@@ -246,7 +246,7 @@ class MapUPyObjData(RecursiveMapperGeneric[str, str]):
]
contained = self.Map(srcVar, destVar, node._containedType, leafTypeDict, names)
if contained:
lines.extend(' ' + l for l in contained[:-1])
lines.extend(' ' + ll for ll in contained[:-1])
lines.append(' %s data[%u];' % (contained[-1], num_items))
lines.append('}')
return lines
......@@ -346,8 +346,8 @@ class MapUPyObjEncode(RecursiveMapperGeneric[str, Tuple[str, str]]):
])
for it, child in enumerate(node._members):
lines.append('%sif ((%s).kind == %s) {' % ('else ' if it else '', srcVar, self.CleanName(child[2])))
lines.extend(' ' + l
for l in self.Map(
lines.extend(' ' + ll
for ll in self.Map(
"(%s).u.%s" % (srcVar, self.CleanName(child[0])),
('(%s)->items[0]' % data, '&(%s)->data.%s' % (data, self.CleanName(child[0]))),
child[1],
......@@ -370,8 +370,8 @@ class MapUPyObjEncode(RecursiveMapperGeneric[str, Tuple[str, str]]):
'(%s)->list.items = &(%s)->items[0];' % (data, data),
'for (size_t %s = 0; %s < %s; ++%s) {' % (it, it, limit, it),
]
lines.extend(' ' + l
for l in self.Map(
lines.extend(' ' + ll
for ll in self.Map(
'(%s).arr[%s]' % (srcVar, it),
('(%s)->items[%s]' % (data, it), '&(%s)->data[%s]' % (data, it)),
node._containedType, leafTypeDict, names))
......@@ -446,8 +446,8 @@ class MapUPyObjDecode(RecursiveMapperGeneric[str, str]):
lines.append('extern qstr %s[1];' % f)
for it, child in enumerate(node._members):
lines.append('%sif (MP_OBJ_TO_PTR(((mp_obj_tuple_t*)MP_OBJ_TO_PTR(%s))->items[1]) == %s) {' % ('else ' if it else '', srcVar, fields_names[it]))
lines.extend(' ' + l
for l in self.Map(
lines.extend(' ' + ll
for ll in self.Map(
'((mp_obj_tuple_t*)MP_OBJ_TO_PTR(%s))->items[0]' % srcVar,
'(%s).u.%s' % (destVar, self.CleanName(child[0])),
child[1],
......@@ -467,8 +467,8 @@ class MapUPyObjDecode(RecursiveMapperGeneric[str, str]):
lines = [
'for (size_t %s = 0; %s < %s; ++%s) {' % (it, it, limit, it),
]
lines.extend(' ' + l
for l in self.Map(
lines.extend(' ' + ll
for ll in self.Map(
'((mp_obj_list_t*)MP_OBJ_TO_PTR(%s))->items[%s]' % (srcVar, it),
'%s.arr[%s]' % (destVar, it),
node._containedType, leafTypeDict, names))
......@@ -548,7 +548,7 @@ class MicroPython_GlueGenerator(ASynchronousToolGlueGenerator):
if dataType == 'void':
lines.append(' (void)pData;') # suppress C compiler warning
lines.append(' mp_obj_t o;')
lines.extend(' ' + l for l in self.MapUPyObjEncode.Map('*pVal', ('o', 'pData'), node, leafTypeDict, names))
lines.extend(' ' + ll for ll in self.MapUPyObjEncode.Map('*pVal', ('o', 'pData'), node, leafTypeDict, names))
lines.append(' return o;')
lines.append('}')
self.C_SourceFile.write('\n'.join(lines) + '\n\n')
......@@ -573,7 +573,7 @@ class MicroPython_GlueGenerator(ASynchronousToolGlueGenerator):
lines = []
lines.append('void mp_obj_decode_asn1Scc%s(mp_obj_t obj, asn1Scc%s *pVal) {' % (tname, tname))
lines.extend(' ' + l for l in self.MapUPyObjDecode.Map('obj', '(*pVal)', node, leafTypeDict, names))
lines.extend(' ' + ll for ll in self.MapUPyObjDecode.Map('obj', '(*pVal)', node, leafTypeDict, names))
lines.append('}')
self.C_SourceFile.write('\n'.join(lines) + '\n\n')
......
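
The `l` to `ll` renames in this mapper look cosmetic, but are presumably there to satisfy flake8's E741 check (the changelog above mentions flake8 commit hooks), which flags `l`, `O`, and `I` as ambiguous single-character names easily confused with `1` and `0`. A trivial, made-up illustration:

```python
# flake8 reports "E741 ambiguous variable name 'l'" for the first
# comprehension (suppressed here with noqa, shown only for contrast);
# the second, functionally identical one passes the check.
lines = ["a", "b"]
indented_bad = [' ' + l for l in lines]    # noqa: E741
indented_ok = [' ' + ll for ll in lines]
assert indented_bad == indented_ok
```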
......@@ -100,7 +100,6 @@ from ..commonPy.recursiveMapper import RecursiveMapper
from .asynchronousTool import ASynchronousToolGlueGenerator
isAsynchronous = True
ogBackend = None
# noinspection PyListCreation
......@@ -800,6 +799,9 @@ class OG_GlueGenerator(ASynchronousToolGlueGenerator):
# (self.CleanNameAsToolWants(nodeTypename), self.CleanNameAsToolWants(nodeTypename)))
ogBackend: OG_GlueGenerator
def OnStartup(modelingLanguage: str, asnFile: str, outputDir: str, maybeFVname: str, useOSS: bool) -> None:
global ogBackend
ogBackend = OG_GlueGenerator()
......
......@@ -3,7 +3,7 @@
import re
import os
from typing import List
from typing import List, IO, Any
from ..commonPy.asnAST import (
AsnInt, AsnBool, AsnReal, AsnEnumerated,
......@@ -14,13 +14,13 @@ from ..commonPy.utility import panic
from ..commonPy.asnParser import AST_Lookup, AST_Leaftypes
from ..commonPy.aadlAST import ApLevelContainer, Param
g_PyDataModel = None
g_PyDataModel: IO[Any]
g_iter = 1
g_IFCount = 0
g_BackendFile = None
g_BackendFile: IO[Any]
g_fromPysideToASN1 = [] # type: List[str]
g_fromASN1ToPyside = [] # type: List[str]
g_QUiFile = None
g_QUiFile: IO[Any]
g_bStarted = False # type: bool
g_firstElem = True # type: bool
g_asnId = "" # type: str
......
......@@ -279,6 +279,8 @@ def OnShutdown(unused_modelingLanguage: str, unused_asnFile: str, unused_sp: ApL
def OnFinal() -> None:
assert g_HeaderFile is not None
assert g_PythonFile is not None
g_HeaderFile.write("\n#endif\n")
g_PythonFile.write('\n'.join(g_headerPython))
g_PythonFile.write('\n\n')
......
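
If `g_HeaderFile` and `g_PythonFile` in that module are still plain `None`-initialised globals (as the handles elsewhere were before these changes), the added asserts serve two purposes: `assert x is not None` narrows the type for mypy at that point, and it fails fast if `OnFinal` is ever called before `OnStartup`. A minimal standalone sketch with made-up file contents:

```python
# Illustrative sketch: assert-based narrowing of an Optional module global.
from typing import IO, Any, Optional

g_HeaderFile: Optional[IO[Any]] = None

def OnStartup(path: str) -> None:
    global g_HeaderFile
    g_HeaderFile = open(path, "w")

def OnFinal() -> None:
    # Without the assert, mypy reports: Item "None" of "Optional[IO[Any]]"
    # has no attribute "write".
    assert g_HeaderFile is not None
    g_HeaderFile.write("\n#endif\n")
    g_HeaderFile.close()
```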