[flake8]
ignore = E501,E225,C103
max-line-length = 160
*.pyc
*,cover
.coverage
tests-coverage/output
tests-coverage/smp2.asn
tests-coverage/datatypessimulink.cat
tests-coverage/datatypessimulink.pkg
tests-coverage/Simulink_DataView_asn.m
*swp
antlr-2.7.7
asn1scc
# Change Log
## 2.1.2
- Use Unsigned_32 in Dataview.aadl to support large messages
## 2.1.1
- Moved to statically imported (and verifiable by mypy) A and B mappers
## 2.1.0
- Major update of the PySide B mapper (API updates)
- Bug fix in the Python A mapper
## 2.0.0
- Moved to Python3
- Added type annotations, checked via mypy
- Added git commit hooks to check via flake8 AND pylint
- Configuration files used to customize the checks for the project's
naming conventions.
- Many bugs identified and fixed.
## 1.2.3
- ctypes backend: emit all ENUMERATED values in DV.py
## 1.2.0
- Not using SWIG anymore for the Python mappers
## 1.1.2
- pyside_b_mapper: added combo box for the asn1 value editor
## 1.1.1
- support empty SEQUENCEs (MyType ::= SEQUENCE {}) - except for Simulink
PY_SRC:=$(wildcard dmt/asn2dataModel.py dmt/aadl2glueC.py dmt/smp2asn.py dmt/*mappers/[a-zA-Z]*py dmt/commonPy/[a-zA-Z]*py)
PY_SRC:=$(filter-out dmt/B_mappers/antlr.main.py dmt/A_mappers/Stubs.py, ${PY_SRC})
# Python3.5 includes an older version of typing, which by default has priority over
# the one installed in $HOME/.local via setup.py.
#
# To address this, we find where our pip-installed typing lives:
TYPING_FOLDER:=$(shell pip3 show typing | grep ^Location | sed 's,^.*: ,,')
export PYTHONPATH=${TYPING_FOLDER}

all: flake8 pylint mypy coverage

flake8:
	@echo Performing syntax checks via flake8...
	@flake8 ${PY_SRC} || exit 1

pylint:
	@echo Performing static analysis via pylint...
	@pylint --disable=I --rcfile=pylint.cfg ${PY_SRC} | grep -v '^$$' | sed -n '/^Report/q;p'

mypy:
	@echo Performing type analysis via mypy...
	@mypy --disallow-untyped-defs --check-untyped-defs ${PY_SRC} || exit 1

coverage:
	@echo Performing coverage checks...
	@$(MAKE) -C tests-coverage || exit 1

.PHONY: flake8 pylint mypy coverage
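
The PYTHONPATH workaround in the Makefile above (pointing at the pip-installed
`typing`) can be sanity-checked from Python itself. This is a minimal check, not
part of the repository, assuming Python 3 and a `typing` package installed under
`$HOME/.local`:

    # Hypothetical check: prints which 'typing' module wins on sys.path.
    import typing
    print(typing.__file__)  # should point under ~/.local/... when PYTHONPATH is set as above
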
[![Build and Test Status of Data Modelling Tools on Circle CI](https://circleci.com/gh/ttsiodras/DataModellingTools.svg?&style=shield&circle-token=9df10d36b6b4ccd923415a5890155b7bf54b95c5)](https://circleci.com/gh/ttsiodras/DataModellingTools/tree/master)

TASTE Data Modelling Tools
==========================

These are the tools used by the European Space Agency's [TASTE toolchain](https://taste.tuxfamily.org/)
to automate the handling of Data Modelling. They include more than two
dozen code generators that automatically create the 'glue': the run-time translation
bridges that allow code generated by modelling tools (Simulink, SCADE, OpenGeode, etc.)
to "speak" to one another via ASN.1 marshalling.

For the encoders and decoders of the messages
themselves, TASTE uses [ASN1SCC](https://github.com/ttsiodras/asn1scc) - an ASN.1
compiler specifically engineered for safety-critical environments.
For more details, visit the [TASTE site](https://taste.tuxfamily.org/).

Installation
------------

To use the tools, this should suffice:

    $ sudo apt-get install libxslt1-dev libxml2-dev zlib1g-dev python3-pip
    $ pip3 install --user --upgrade .

For developing the tools, the packaged Makefile allows for easy static analysis
via the dominant Python static analyzers and syntax checkers:

    $ make flake8   # check for pep8 compliance
    $ make pylint   # static analysis with pylint
    $ make mypy     # type analysis with mypy

Contents
--------
What is packaged:
- **commonPy** (*library*)

    Contains the basic API for parsing ASN.1 (via invocation of
    [ASN1SCC](https://github.com/ttsiodras/asn1scc) and simplification
    of the generated XML AST representation to the Python classes
    inside `asnAST.py`). The class diagram with the AST classes
    is [packaged in the code](dmt/commonPy/asnAST.py#L42). A minimal
    usage sketch follows this list.

- **asn2aadlPlus** (*utility*)

    Converts the type declarations inside ASN.1 grammars to AADL
    declarations, which are used by [Ocarina](https://github.com/OpenAADL/ocarina)
    to generate the executable containers.

- **asn2dataModel** (*utility*)

    Reads the ASN.1 specification of the exchanged messages, and generates
    the semantically equivalent Modeling tool/Modeling language declarations
    (e.g. SCADE/Lustre, Matlab/Simulink, etc.).

    The actual mapping logic exists in plugins, called *A mappers*
    (`simulink_A_mapper.py` handles Simulink/RTW, `scade6_A_mapper.py`
    handles SCADE6, `ada_A_mapper.py` generates Ada types,
    `sqlalchemy_A_mapper.py` generates SQL definitions via SQLAlchemy, etc.)

- **aadl2glueC** (*utility*)

    Reads the AADL specification of the system, and then generates the runtime
    bridge-code that will map the message data structures from those generated
    by [ASN1SCC](https://github.com/ttsiodras/asn1scc) to/from those generated
    by the modeling tool that is used to functionally model the subsystem
    (e.g. SCADE, ObjectGeode, Matlab/Simulink, C, Ada, etc.).
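
As a quick illustration of the `commonPy` API mentioned above, the sketch below
parses an ASN.1 grammar and lists the types it defines. This is a hedged example,
not taken from the repository: it assumes ASN1SCC (`asn1.exe`) is reachable through
the PATH, and that `asnParser` exposes `ParseAsnFileList` plus the `g_names` and
`g_leafTypeDict` globals (check `dmt/commonPy/asnParser.py` for the exact names).
The grammar filename `DataView.asn` is hypothetical.

    # Hedged sketch: parse a grammar and list the ASN.1 types found in it.
    from dmt.commonPy import asnParser

    asnParser.ParseAsnFileList(["DataView.asn"])       # invokes ASN1SCC, builds the AST
    for typename, node in asnParser.g_names.items():   # typename -> asnAST node
        leaf = asnParser.g_leafTypeDict.get(typename)  # e.g. 'INTEGER', 'SEQUENCE', ...
        print(typename, "->", node.__class__.__name__, "leaf type:", leaf)
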
Contact
-------
For bug reports, please use the Issue Tracker; for any other communication,
contact me at:
Thanassis Tsiodras
Real-time Embedded Software Engineer
System, Software and Technology Department
European Space Agency
ESTEC
Keplerlaan 1, PO Box 299
NL-2200 AG Noordwijk, The Netherlands
Athanasios.Tsiodras@esa.int | www.esa.int
T +31 71 565 5332
#
# (C) Semantix Information Technologies.
#
# Semantix Information Technologies is licensing the code of the
# Data Modelling Tools (DMT) in the following dual-license mode:
#
# Commercial Developer License:
# The DMT Commercial Developer License is the suggested version
# to use for the development of proprietary and/or commercial software.
# This version is for developers/companies who do not want to comply
# with the terms of the GNU Lesser General Public License version 2.1.
#
# GNU LGPL v. 2.1:
# This version of DMT is the one to use for the development of
# applications, when you are willing to comply with the terms of the
# GNU Lesser General Public License version 2.1.
#
# Note that in both cases, there are no charges (royalties) for the
# generated code.
#
'''
This is the implementation of the code mapper for Ada code.
As initially envisioned, ASSERT technology is not supposed
to support manually-made systems. A migration path, however,
that allows legacy hand-written code and modelling-tool
generated code to co-exist, can be beneficial in allowing
for a smooth transition. To that end, this backend (as well as
the C one) was written.
This is a backend for Semantix's code generator B (aadl2glueC).
Ada is a member of the asynchronous "club" (SDL, etc.):
the subsystem developer (or rather, the APLC developer) uses
native Ada code to work with code generated by modelling tools.
To that end, this backend creates "glue" functions for input and
output parameters, which have Ada callable interfaces.
'''
# from commonPy.utility import panic
# from recursiveMapper import RecursiveMapper
# from asynchronousTool import ASynchronousToolGlueGenerator
from . import c_B_mapper
isAsynchronous = True
adaBackend = None
cBackend = None
def Version() -> None:
    print("Code generator: " + "$Id: ada_B_mapper.py 2382 2012-06-22 08:35:33Z ttsiodras $")  # pragma: no cover
# The entire Ada B mapper is now obsolete; we are using ASN1SCC for Dumpables
#
# class FromDumpableCtoASN1SCC(RecursiveMapper):
# def __init__(self):
# self.uniqueID = 0
# def UniqueID(self):
# self.uniqueID += 1
# return self.uniqueID
# def DecreaseUniqueID(self):
# self.uniqueID -= 1
# def MapInteger(self, srcCVariable, destVar, _, __, ___):
# return ["%s = %s;\n" % (destVar, srcCVariable)]
# def MapReal(self, srcCVariable, destVar, _, __, ___):
# return ["%s = %s;\n" % (destVar, srcCVariable)]
# def MapBoolean(self, srcCVariable, destVar, _, __, ___):
# return ["%s = %s;\n" % (destVar, srcCVariable)]
# def MapOctetString(self, srcCVariable, destVar, _, __, ___):
# lines = []
# lines.append("{\n")
# lines.append(" int i;\n")
# lines.append(" for(i=0; i<%s.length; i++)\n" % srcCVariable)
# lines.append(" %s.arr[i] = %s.content[i];\n" % (destVar, srcCVariable))
# lines.append(" %s.nCount = %s.length;\n" % (destVar, srcCVariable))
# lines.append("}\n")
# return lines
# def MapEnumerated(self, srcCVariable, destVar, _, __, ___):
# return ["%s = %s;\n" % (destVar, srcCVariable)]
# def MapSequence(self, srcCVariable, destVar, node, leafTypeDict, names):
# lines = []
# for child in node._members:
# lines.extend(
# self.Map(
# "%s.%s" % (srcCVariable, self.CleanName(child[0])),
# destVar + "." + self.CleanName(child[0]),
# child[1],
# leafTypeDict,
# names))
# return lines
# def MapSet(self, srcCVariable, destVar, node, leafTypeDict, names):
# return self.MapSequence(srcCVariable, destVar, node, leafTypeDict, names)
# def MapChoice(self, srcCVariable, destVar, node, leafTypeDict, names):
# lines = []
# childNo = 0
# for child in node._members:
# childNo += 1
# lines.append("%sif (%s.choiceIdx == %d) {\n" %
# (self.maybeElse(childNo), srcCVariable, childNo))
# lines.extend([' '+x for x in self.Map(
# "%s.u.%s" % (srcCVariable, self.CleanName(child[0])),
# destVar + ".u." + self.CleanName(child[0]),
# child[1],
# leafTypeDict,
# names)])
# lines.append(" %s.kind = %s_PRESENT;\n" % (destVar, self.CleanName(child[0])))
# lines.append("}\n")
# return lines
# def MapSequenceOf(self, srcCVariable, destVar, node, leafTypeDict, names):
# lines = []
# lines.append("{\n")
# uniqueId = self.UniqueID()
# lines.append(" int i%s;\n" % uniqueId)
# lines.append(" for(i%s=0; i%s<%s.length; i%s++) {\n" % (uniqueId, uniqueId, srcCVariable, uniqueId))
# lines.extend([" " + x for x in self.Map(
# "%s.content[i%s]" % (srcCVariable, uniqueId),
# "%s.arr[i%s]" % (destVar, uniqueId),
# node._containedType,
# leafTypeDict,
# names)])
# lines.append(" }\n")
# lines.append(" %s.nCount = %s.length;\n" % (destVar, srcCVariable))
# lines.append("}\n")
# self.DecreaseUniqueID()
# return lines
# def MapSetOf(self, srcCVariable, destVar, node, leafTypeDict, names):
# return self.MapSequenceOf(srcCVariable, destVar, node, leafTypeDict, names)
#
# class FromASN1SCCtoDumpableC(RecursiveMapper):
# def __init__(self):
# self.uniqueID = 0
# def UniqueID(self):
# self.uniqueID += 1
# return self.uniqueID
# def DecreaseUniqueID(self):
# self.uniqueID -= 1
# def MapInteger(self, srcCVariable, destVar, _, __, ___):
# return ["%s = %s;\n" % (destVar, srcCVariable)]
# def MapReal(self, srcCVariable, destVar, _, __, ___):
# return ["%s = %s;\n" % (destVar, srcCVariable)]
# def MapBoolean(self, srcCVariable, destVar, _, __, ___):
# return ["%s = %s;\n" % (destVar, srcCVariable)]
# def MapOctetString(self, srcCVariable, destVar, _, __, ___):
# lines = []
# lines.append("{\n")
# lines.append(" int i;\n")
# lines.append(" for(i=0; i<%s.nCount; i++)\n" % srcCVariable)
# lines.append(" %s.content[i] = %s.arr[i];\n" % (destVar, srcCVariable))
# lines.append(" %s.length = %s.nCount;\n" % (destVar, srcCVariable))
# lines.append("}\n")
# return lines
# def MapEnumerated(self, srcCVariable, destVar, _, __, ___):
# return ["%s = %s;\n" % (destVar, srcCVariable)]
# def MapSequence(self, srcCVariable, destVar, node, leafTypeDict, names):
# lines = []
# for child in node._members:
# lines.extend(
# self.Map(
# "%s.%s" % (srcCVariable, self.CleanName(child[0])),
# destVar + "." + self.CleanName(child[0]),
# child[1],
# leafTypeDict,
# names))
# return lines
# def MapSet(self, srcCVariable, destVar, node, leafTypeDict, names):
# return self.MapSequence(srcCVariable, destVar, node, leafTypeDict, names)
# def MapChoice(self, srcCVariable, destVar, node, leafTypeDict, names):
# lines = []
# childNo = 0
# for child in node._members:
# childNo += 1
# lines.append("%sif (%s.kind == %s_PRESENT) {\n" %
# (self.maybeElse(childNo), srcCVariable, self.CleanName(child[0])))
# lines.extend([' '+x for x in self.Map(
# "%s.u.%s" % (srcCVariable, self.CleanName(child[0])),
# destVar + ".u." + self.CleanName(child[0]),
# child[1],
# leafTypeDict,
# names)])
# lines.append(" %s.choiceIdx = %d;\n" % (destVar, childNo))
# lines.append("}\n")
# return lines
# def MapSequenceOf(self, srcCVariable, destVar, node, leafTypeDict, names):
# lines = []
# lines.append("{\n")
# uniqueId = self.UniqueID()
# lines.append(" int i%s;\n" % uniqueId)
# lines.append(" for(i%s=0; i%s<%s.nCount; i%s++) {\n" % (uniqueId, uniqueId, srcCVariable, uniqueId))
# lines.extend([" " + x for x in self.Map(
# "%s.arr[i%s]" % (srcCVariable, uniqueId),
# "%s.content[i%s]" % (destVar, uniqueId),
# node._containedType,
# leafTypeDict,
# names)])
# lines.append(" }\n")
# lines.append(" %s.length = %s.nCount;\n" % (destVar, srcCVariable))
# lines.append("}\n")
# self.DecreaseUniqueID()
# return lines
# def MapSetOf(self, srcCVariable, destVar, node, leafTypeDict, names):
# return self.MapSequenceOf(srcCVariable, destVar, node, leafTypeDict, names)
#
# class Ada_GlueGenerator(ASynchronousToolGlueGenerator):
# def __init__(self):
# ASynchronousToolGlueGenerator.__init__(self)
# self.FromDumpableCtoASN1SCC = FromDumpableCtoASN1SCC()
# self.FromASN1SCCtoDumpableC = FromASN1SCCtoDumpableC()
# self.Ada_HeaderFile = None
# self.Ada_SourceFile = None
# self.definedTypes = {}
# def Version(self):
# print "Code generator: " + "$Id: ada_B_mapper.py 2382 2012-06-22 08:35:33Z ttsiodras $"
# def HeadersOnStartup(self, unused_asnFile, unused_outputDir, unused_maybeFVname):
# if self.useOSS:
# self.C_HeaderFile.write("#include \"%s.oss.h\" // OSS generated\n\n" % self.asn_name)
# self.C_SourceFile.write("\nextern OssGlobal *g_world;\n\n")
# self.C_HeaderFile.write("#include \"%s.h\" // Space certified compiler generated\n\n" % self.asn_name)
# self.C_HeaderFile.write("#include \"DumpableTypes.h\"\n\n")
# def Encoder(self, nodeTypename, node, leafTypeDict, names, encoding):
# if encoding.lower() not in self.supportedEncodings:
# panic(str(self.__class__) + ": in (%s), encoding can be one of %s (not '%s')" %
# (nodeTypename, self.supportedEncodings, encoding))
#
# # Definition of the standard encoding function (same interface as the C mapper )
# cBackend.Encoder(nodeTypename, node, leafTypeDict, names, encoding)
# # End standard encoding function
#
# # in order not to duplicate conversion functions, skip the rest if encoding is native
# if encoding.lower() == "native":
# return
#
# if not self.definedTypes.has_key(nodeTypename):
# self.definedTypes[nodeTypename] = 1
# # Declare/define the C stub variable (one per ASN.1 type)
# self.C_HeaderFile.write("\n/* --- Staging var for %s --- */\n" % (nodeTypename))
#
# tmpTypeName = "asn1Scc%s" % self.CleanNameAsToolWants(nodeTypename)
# tmpVarName = "asn1scc"
# tmpSpName = "Ada_to_SCC_%s" % \
# self.CleanNameAsToolWants(nodeTypename)
#
# self.C_HeaderFile.write(
# "void %s(GT__%s *ada, %s *%s);\n" %
# (tmpSpName,
# self.CleanNameAsToolWants(nodeTypename),
# tmpTypeName,
# tmpVarName))
# self.C_SourceFile.write(
# "void %s(GT__%s *ada, %s *%s)\n{\n" %
# (tmpSpName,
# self.CleanNameAsToolWants(nodeTypename),
# tmpTypeName,
# tmpVarName))
#
# lines = self.FromDumpableCtoASN1SCC.Map(
# "(*ada)",
# "(*asn1scc)",
# node,
# leafTypeDict,
# names)
# lines = [" "+x for x in lines]
#
# self.C_SourceFile.write("".join(lines))
# self.C_SourceFile.write("}\n\n")
#
# def Decoder(self, nodeTypename, node, leafTypeDict, names, encoding):
# if encoding.lower() not in self.supportedEncodings:
# panic(str(self.__class__) + ": in (%s), encoding can be one of %s (not '%s')" %
# (nodeTypename, self.supportedEncodings, encoding))
#
# # Definition of the standard decoding function (same interface as the C mapper )
# cBackend.Decoder(nodeTypename, node, leafTypeDict, names, encoding)
# # End standard decoding function
#
# if encoding.lower() == "native":
# return
#
# tmpTypeName = "asn1Scc%s" % self.CleanNameAsToolWants(nodeTypename)
# tmpVarName = "asn1scc"
# tmpSpName = "SCC_to_Ada_%s" % self.CleanNameAsToolWants(nodeTypename)
#
# # Create C function that does the encoding
# self.C_HeaderFile.write(
# "void %s(%s *%s, GT__%s *ada);\n" %
# (tmpSpName,
# tmpTypeName,
# tmpVarName,
# self.CleanNameAsToolWants(nodeTypename)))
# self.C_SourceFile.write(
# "void %s(%s *%s, GT__%s *ada)\n{\n" %
# (tmpSpName,
# tmpTypeName,
# tmpVarName,
# self.CleanNameAsToolWants(nodeTypename)))
#
# lines = self.FromASN1SCCtoDumpableC.Map(
# "(*asn1scc)",
# "(*ada)",
# node,
# leafTypeDict,
# names)
# lines = [" "+x for x in lines]
#
# self.C_SourceFile.write("".join(lines))
# self.C_SourceFile.write("}\n\n")
#
# def OnShutdown(self, modelingLanguage, asnFile, maybeFVname):
# ASynchronousToolGlueGenerator.OnShutdown(self, modelingLanguage, asnFile, maybeFVname)
def OnStartup(unused_modelingLanguage, asnFile, outputDir, maybeFVname, useOSS):
    global cBackend
    # 2009-02-10: Since we now use ASN1SCC structures as dumpables (even for Ada)
    # we no longer need these Ada-specific Dumpable structures.
    #global adaBackend
    #adaBackend = Ada_GlueGenerator()
    cBackend = c_B_mapper.C_GlueGenerator()
    #adaBackend.OnStartup(modelingLanguage, asnFile, outputDir, maybeFVname, useOSS)
    cBackend.OnStartup("C", asnFile, outputDir, maybeFVname, useOSS)


def OnBasic(nodeTypename, node, leafTypeDict, names):
    cBackend.OnBasic(nodeTypename, node, leafTypeDict, names)


def OnSequence(nodeTypename, node, leafTypeDict, names):
    cBackend.OnSequence(nodeTypename, node, leafTypeDict, names)


def OnSet(nodeTypename, node, leafTypeDict, names):
    cBackend.OnSet(nodeTypename, node, leafTypeDict, names)  # pragma: nocover


def OnEnumerated(nodeTypename, node, leafTypeDict, names):
    cBackend.OnEnumerated(nodeTypename, node, leafTypeDict, names)


def OnSequenceOf(nodeTypename, node, leafTypeDict, names):
    cBackend.OnSequenceOf(nodeTypename, node, leafTypeDict, names)


def OnSetOf(nodeTypename, node, leafTypeDict, names):
    cBackend.OnSetOf(nodeTypename, node, leafTypeDict, names)  # pragma: nocover


def OnChoice(nodeTypename, node, leafTypeDict, names):
    cBackend.OnChoice(nodeTypename, node, leafTypeDict, names)


def OnShutdown(unused_modelingLanguage, asnFile, maybeFVname):
    cBackend.OnShutdown("C", asnFile, maybeFVname)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
asn2aadlPlus converts ASN.1 modules to AADL (v1 or v2) for use in TASTE
"""
from asn2aadlPlus import main
__version__ = 1.0
machine:
  post:
    - pyenv global 3.4.4 system

dependencies:
  cache_directories:
    - "~/.apt-cache"
  pre:
    - sudo rm -rf /var/cache/apt/archives && sudo ln -s ~/.apt-cache /var/cache/apt/archives && mkdir -p ~/.apt-cache/partial
    - sudo apt-get update
    - sudo apt-get install libxslt-dev libxml2-dev
    - wget -O - -q https://github.com/ttsiodras/asn1scc/releases/download/3.2.81/asn1scc-bin-3.2.81.tar.gz | tar zxvf -
    - wget -O - -q https://github.com/ttsiodras/DataModellingTools/files/335591/antlr-2.7.7.tar.gz | tar zxvf - ; cd antlr-2.7.7/lib/python ; sudo pip2 install .
    - sudo apt-get install mono-complete
  override:
    - pip3 install -r requirements.txt

test:
  override:
    - PATH=$PATH:$(pwd)/asn1scc make
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
This module contains the shared API for parsing ASN.1
"""
from . import configMT
from . import asnParser
from . import asnAST
from . import utility
from . import createInternalTypes
from . import verify
from . import recursiveMapper
from . import cleanupNodes
__version__ = "1.1.0"
# -*- coding: utf-8 -*-
"""
asn2dataModel converts ASN.1 modules to a variety of target languages
"""
from asn2dataModel import main
import msgPrinter
import msgPrinterASN1

from ..commonPy import __version__
import os
import sys
import distutils.spawn as spawn

from typing import List

from ..commonPy.utility import panic
from ..commonPy.cleanupNodes import SetOfBadTypenames
from ..commonPy.asnAST import AsnBasicNode, AsnSequenceOrSet, AsnSequenceOrSetOf, AsnEnumerated, AsnChoice
from ..commonPy.asnParser import AST_Leaftypes


def Version() -> None:
    print("Code generator: " + "$Id: ada_A_mapper.py 2382 2012-06-22 08:35:33Z ttsiodras $")  # pragma: no cover


# Especially for the C mapper, since we need to pass the complete ASN.1 files list to ASN1SCC,
# the second param is not asnFile, it is asnFiles
def OnStartup(unused_modelingLanguage: str, asnFiles: List[str], outputDir: str, unused_badTypes: SetOfBadTypenames) -> None:  # pylint: disable=invalid-sequence-index
    # print "Use ASN1SCC to generate the structures for '%s'" % asnFile
    asn1SccPath = spawn.find_executable('asn1.exe')
    if not asn1SccPath:
        panic("ASN1SCC seems to be missing from your system (asn1.exe not found in PATH).\n")  # pragma: no cover
    os.system(
        ("mono " if sys.argv[0].endswith('.py') and sys.platform.startswith('linux') else "") +
        "\"{}\" -wordSize 8 -typePrefix asn1Scc -Ada -uPER -o \"".format(asn1SccPath) +
        outputDir + "\" \"" + "\" \"".join(asnFiles) + "\"")
    os.system("rm -f \"" + outputDir + "\"/*.adb")


def OnBasic(unused_nodeTypename: str, unused_node: AsnBasicNode, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnSequence(unused_nodeTypename: str, unused_node: AsnSequenceOrSet, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnSet(unused_nodeTypename: str, unused_node: AsnSequenceOrSet, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnEnumerated(unused_nodeTypename: str, unused_node: AsnEnumerated, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnSequenceOf(unused_nodeTypename: str, unused_node: AsnSequenceOrSetOf, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnSetOf(unused_nodeTypename: str, unused_node: AsnSequenceOrSetOf, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnChoice(unused_nodeTypename: str, unused_node: AsnChoice, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnShutdown(unused_badTypes: SetOfBadTypenames) -> None:
    pass  # pragma: no cover
import os
import sys
import distutils.spawn as spawn

from typing import List

from ..commonPy.utility import panic
from ..commonPy.cleanupNodes import SetOfBadTypenames
from ..commonPy.asnAST import AsnBasicNode, AsnSequenceOrSet, AsnSequenceOrSetOf, AsnEnumerated, AsnChoice
from ..commonPy.asnParser import AST_Leaftypes


def Version() -> None:
    print("Code generator: " + "$Id: c_A_mapper.py 2382 2012-06-22 08:35:33Z ttsiodras $")  # pragma: no cover


# Especially for the C mapper, since we need to pass the complete ASN.1 files list to ASN1SCC,
# the second param is not asnFile, it is asnFiles
def OnStartup(unused_modelingLanguage: str, asnFiles: List[str], outputDir: str, unused_badTypes: SetOfBadTypenames) -> None:  # pylint: disable=invalid-sequence-index
    # print "Use ASN1SCC to generate the structures for '%s'" % asnFile
    asn1SccPath = spawn.find_executable('asn1.exe')
    if not asn1SccPath:
        panic("ASN1SCC seems to be missing from your system (asn1.exe not found in PATH).\n")  # pragma: no cover
    os.system(
        ("mono " if sys.argv[0].endswith('.py') and sys.platform.startswith('linux') else "") +
        "\"{}\" -wordSize 8 -typePrefix asn1Scc -c -uPER -o \"".format(asn1SccPath) +
        outputDir + "\" \"" + "\" \"".join(asnFiles) + "\"")
    cmd = 'rm -f '
    for i in ['real.c', 'asn1crt.c', 'acn.c']:
        cmd += ' "' + outputDir + '"/' + i
    os.system(cmd)
    for tmp in asnFiles:
        os.system("rm -f \"" + outputDir + os.sep + os.path.basename(os.path.splitext(tmp)[0]) + ".c\"")


def OnBasic(unused_nodeTypename: str, unused_node: AsnBasicNode, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnSequence(unused_nodeTypename: str, unused_node: AsnSequenceOrSet, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnSet(unused_nodeTypename: str, unused_node: AsnSequenceOrSet, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnEnumerated(unused_nodeTypename: str, unused_node: AsnEnumerated, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnSequenceOf(unused_nodeTypename: str, unused_node: AsnSequenceOrSetOf, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnSetOf(unused_nodeTypename: str, unused_node: AsnSequenceOrSetOf, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnChoice(unused_nodeTypename: str, unused_node: AsnChoice, unused_leafTypeDict: AST_Leaftypes) -> None:
    pass  # pragma: no cover


def OnShutdown(unused_badTypes: SetOfBadTypenames) -> None:
    pass  # pragma: no cover
#!/usr/bin/env python2
import os
import sys

choices = []
enums = []
bEnum = False

''' Parse the ASN.1-generated header file and extract the constants used for
CHOICE determinants (#define det..._PRESENT) and ENUMERATED values
Emit strings that are appended to DV.py from Makefile.python
There is no name clash thanks to the rename policy of the ASN.1 compiler
(a constant cannot be defined twice)
'''

for line in open(sys.argv[1] + '.h', 'r'):
    if '_PRESENT' in line and not line.startswith('#define'):
        choices.append(line.strip().replace(",", ""))
    elif line.strip().startswith('typedef enum {'):
        bEnum = True
    elif line.strip().startswith('}') and bEnum:
        bEnum = False
    elif bEnum:
        enums.append(line.strip().replace(",", "").split("="))

enums_dump = "\n ".join(
    'printf("%s = %%d\\n", %s);' % (e, e)
    for e in choices
)
enums_dump += "\n ".join(
    'printf("%s = %d\\n");' % (name.strip(), int(val))
    for name, val in enums
)

uniq = os.getpid()
extractor_filename = "/tmp/enums_%d" % uniq
f = open(extractor_filename + ".c", 'w')
f.write("""
#include <stdio.h>
#include "%(base)s.h"

void main()
{
    %(enums_dump)s
}""" % {"enums_dump": enums_dump, "base": sys.argv[1]})
f.close()
cmd = "gcc -o %s -I. %s.c" % (extractor_filename, extractor_filename)
if os.system(cmd) != 0:
    print("Failed to extract CHOICE enum values...")
    sys.exit(1)
os.system(extractor_filename)
os.unlink(extractor_filename + ".c")
os.unlink(extractor_filename)
from types import ModuleType
from typing import List, Union

from ..commonPy.asnAST import AsnNode  # NOQA pylint: disable=unused-import
from ..commonPy.asnParser import Filename, Typename, AST_Lookup, AST_TypesOfFile, AST_Leaftypes  # NOQA pylint: disable=unused-import
from ..commonPy.cleanupNodes import SetOfBadTypenames

Filename_Or_ListOfFilenames = Union[str, List[str]]  # pylint: disable=invalid-sequence-index


class A_Mapper(ModuleType):
    def OnStartup(
            self, modelingLanguage: str, asnFile: Filename_Or_ListOfFilenames,
            outputDir: str, badTypes: SetOfBadTypenames) -> None:
        pass

    def OnBasic(self, nodeTypename: str, node: AsnNode, leafTypeDict: AST_Leaftypes) -> None:
        pass

    def OnSequence(self, nodeTypename: str, node: AsnNode, leafTypeDict: AST_Leaftypes) -> None:
        pass

    def OnSet(self, nodeTypename: str, node: AsnNode, leafTypeDict: AST_Leaftypes) -> None:
        pass

    def OnChoice(self, nodeTypename: str, node: AsnNode, leafTypeDict: AST_Leaftypes) -> None:
        pass

    def OnSequenceOf(self, nodeTypename: str, node: AsnNode, leafTypeDict: AST_Leaftypes) -> None:
        pass

    def OnSetOf(self, nodeTypename: str, node: AsnNode, leafTypeDict: AST_Leaftypes) -> None:
        pass

    def OnEnumerated(self, nodeTypename: str, node: AsnNode, leafTypeDict: AST_Leaftypes) -> None:
        pass

    def OnShutdown(self, badTypes: SetOfBadTypenames) -> None:
        pass
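
The `A_Mapper` stub above documents the module-level callback interface that
`asn2dataModel` drives for each A mapper plugin. The following is a hedged,
hypothetical mapper module (not part of the repository) showing the minimal shape
such a plugin takes; the callback names and argument order mirror the stubs above
and the `ada_A_mapper`/`c_A_mapper` modules shown earlier.

# hypothetical_A_mapper.py -- a minimal A mapper sketch (illustration only, not in the repo)

def OnStartup(modelingLanguage, asnFile, outputDir, badTypes):
    # Called once; asnFile may be a single filename or a list (see Filename_Or_ListOfFilenames).
    print("Generating into", outputDir)


def OnBasic(nodeTypename, node, leafTypeDict):
    print("Basic (INTEGER/REAL/...) type:", nodeTypename)


def OnSequence(nodeTypename, node, leafTypeDict):
    print("SEQUENCE:", nodeTypename)


def OnSet(nodeTypename, node, leafTypeDict):
    print("SET:", nodeTypename)


def OnEnumerated(nodeTypename, node, leafTypeDict):
    print("ENUMERATED:", nodeTypename)


def OnSequenceOf(nodeTypename, node, leafTypeDict):
    print("SEQUENCE OF:", nodeTypename)


def OnSetOf(nodeTypename, node, leafTypeDict):
    print("SET OF:", nodeTypename)


def OnChoice(nodeTypename, node, leafTypeDict):
    print("CHOICE:", nodeTypename)


def OnShutdown(badTypes):
    print("Done.")
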
"""
aadl2glueC : B mappers -generate code that convert from/to ASN1SCC
"""
from aadl2glueC import main
__version__ = 1.0
from ..commonPy import __version__
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
DMT - Data Modelling Technologies
"""
def inform(format, *args):
    if configMT.verbose:
        print(format % args)


def warn(format, *args):