Commit 588359dc authored by Thanassis Tsiodras

Merge remote-tracking branch 'esa-DMT/master'

parents bdf70668 369e0722
# image: $CI_REGISTRY_IMAGE/taste:latest
image: dmt:latest

variables:
  GIT_SUBMODULE_STRATEGY: recursive

stages:
  - build
  - post_build

build:
  stage: build
  script:
    - ./configure ; pip3 install --upgrade . ; LANG=C LC_ALL=C PATH=$PATH:/asn1scc make
#  artifacts:
#    paths:
#      - 'test/logs/*.err.txt'
#    when: on_failure
#    expire_in: 2 weeks
PY_SRC:=$(wildcard dmt/asn2dataModel.py dmt/aadl2glueC.py dmt/smp2asn.py dmt/*mappers/[a-zA-Z]*py dmt/commonPy/[a-zA-Z]*py)
-PY_SRC:=$(filter-out dmt/B_mappers/antlr.main.py dmt/A_mappers/Stubs.py, ${PY_SRC})
+PY_SRC:=$(filter-out dmt/B_mappers/antlr.main.py dmt/A_mappers/Stubs.py dmt/B_mappers/micropython_async_B_mapper.py dmt/commonPy/commonSMP2.py, ${PY_SRC})

# Python3.5 includes an older version of typing, which by default has priority over
# the one installed in $HOME/.local via setup.py.

@@ -25,7 +25,7 @@ flake8:

pylint:
	@echo Performing static analysis via pylint...
-	@pylint --disable=I --rcfile=pylint.cfg ${PY_SRC} | grep -v '^$$' | sed -n '/^Report/q;p'
+	@pylint --disable=I --rcfile=pylint.cfg ${PY_SRC}

mypy:
	@echo Performing type analysis via mypy...

...
-[![Build and Test Status of Data Modelling Tools on Circle CI](https://circleci.com/gh/ttsiodras/DataModellingTools.svg?&style=shield&circle-token=9df10d36b6b4ccd923415a5890155b7bf54b95c5)](https://circleci.com/gh/ttsiodras/DataModellingTools/tree/master)
+[![Build and Test Status of Data Modelling Tools on Gitlab CI](https://gitrepos.estec.esa.int/taste/dmt/badges/master/pipeline.svg)](https://gitrepos.estec.esa.int/taste/dmt/-/commits/master)
TASTE Data Modelling Tools
==========================

These are the tools used by the European Space Agency's [TASTE toolchain](https://taste.tools/)
to automate handling of the Data Modelling. They include more than two
-dozen codegenerators that automatically create the 'glue'; the run-time translation
+dozen codegenerators that automatically create the 'glue'; that is, the run-time translation
bridges that allow code generated by modelling tools (Simulink, SCADE, OpenGeode, etc)
to "speak" to one another, via ASN.1 marshalling.
@@ -70,7 +70,7 @@ What is packaged:

Reads the AADL specification of the system, and then generates the runtime
bridge-code that will map the message data structures from those generated
by [ASN1SCC](https://github.com/ttsiodras/asn1scc) to/from those generated
-by the modeling tool (that is used to functionally model the subsystem -
+by the modeling tool (that is used to functionally model the subsystem;
e.g. SCADE, ObjectGeode, Matlab/Simulink, C, Ada, etc).
Contact

@@ -84,7 +84,7 @@ contact me at:

System, Software and Technology Department
European Space Agency
-ESTEC
+ESTEC / TEC-SWT
Keplerlaan 1, PO Box 299
NL-2200 AG Noordwijk, The Netherlands
Athanasios.Tsiodras@esa.int | www.esa.int

...
@@ -547,7 +547,7 @@ def DumpTypeDumper(

        lines.append(codeIndent + "state = self.GetState()")
        lines.append(codeIndent + "length = %s.GetLength()" % variableName)
        lines.append(codeIndent + "self.Reset(state)")
-        lines.append(codeIndent + "map(partial(emitElem, %s), range(length))" % variableName)
+        lines.append(codeIndent + "list(map(partial(emitElem, %s), range(length)))" % variableName)
        lines.append(codeIndent + 'self.Reset(state)')
        lines.append(codeIndent + 'lines.append("}")')

...
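The `list(map(...))` fix above matters because in Python 3 `map` is lazy: without forcing the iterator, the per-element `emitElem` calls in the generated code would never execute. A minimal sketch of the difference, using a hypothetical `emit_elem` stand-in that just records its calls:

```python
from functools import partial

calls = []

def emit_elem(var, i):
    # Stand-in for the generated emitElem: record which element was visited.
    calls.append((var, i))

# In Python 3, building the map object alone triggers no calls.
m = map(partial(emit_elem, "myVar"), range(3))
assert calls == []

# Wrapping it in list() forces the side effects, as the fixed code does.
list(m)
assert calls == [("myVar", 0), ("myVar", 1), ("myVar", 2)]
```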
@@ -158,9 +158,9 @@ class FromASN1SCCtoSimulink(RecursiveMapper):

        lines = []  # type: List[str]
        limit = sourceSequenceLimit(node, srcVar)
-        for i in range(0, node._range[-1]):
-            lines.append("if (%s>=%d) %s.element_data[%d] = %s.arr[%d]; else %s.element_data[%d] = 0;\n" %
-                         (limit, i + 1, dstSimulink, i, srcVar, i, dstSimulink, i))
+        lines.append("unsigned int i=0;\n")
+        lines.append("for(i=0; i<%s; i++)\n    %s.element_data[i] = %s.arr[i];\n" % (limit, dstSimulink, srcVar))
        if len(node._range) > 1 and node._range[0] != node._range[1]:
            lines.append("%s.length = %s;\n" % (dstSimulink, limit))
        return lines

...
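The Simulink mapper change above stops unrolling one C statement per array element and instead emits a single runtime `for` loop bounded by the sequence limit. A small sketch of the emitted text, with hypothetical names (`myLimit`, `dst`, `src`) standing in for the real limit and variable expressions:

```python
def emit_copy_loop(limit: str, dst: str, src: str) -> str:
    # Mirrors the new code path: one C for-loop bounded by the runtime limit,
    # instead of node._range[-1] unrolled if/else statements.
    lines = []
    lines.append("unsigned int i=0;\n")
    lines.append("for(i=0; i<%s; i++)\n    %s.element_data[i] = %s.arr[i];\n"
                 % (limit, dst, src))
    return "".join(lines)

print(emit_copy_loop("myLimit", "dst", "src"))
```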
@@ -121,7 +121,7 @@ def calculateForNativeAndASN1SCC(absASN1SCCpath, autosrc, names, inputFiles):

        cleaned = cleanNameAsAsn1cWants(asnTypename)
        msgEncoderFile.write('static asn1Scc%s sizeof_%s;\n' % (cleaned, cleaned))
        msgEncoderFile.write('char bytesEncoding_%s[asn1Scc%s_REQUIRED_BYTES_FOR_ENCODING];\n' % (cleaned, cleaned))
-        if acn != "":
+        if acn != "" and node.hasAcnEncDec:
            msgEncoderFile.write('char bytesAcnEncoding_%s[asn1Scc%s_REQUIRED_BYTES_FOR_ACN_ENCODING];\n' % (cleaned, cleaned))
    msgEncoderFile.close()
@@ -290,7 +290,12 @@ def main():

        panic("'%s' is not a file!\n" % x)
    aadlFile = args[-1]
-    inputFiles = [os.path.abspath(x) for x in args[:-1]]
+    inputFiles = [
+        os.path.relpath(x)
+        if 'tool-inst' not in x and not x.startswith('/tmp')
+        else x
+        for x in args[:-1]
+    ]

    def md5(filename):
        hash_md5 = hashlib.md5()
@@ -333,7 +338,7 @@ def main():

    # Parse the ASN.1 files (skip the ACN ones)
    asnFiles = [x for x in inputFiles if not x.lower().endswith('.acn')]
-    asnParser.ParseAsnFileList(asnFiles)
+    asnParser.ParseAsnFileList(inputFiles)
    autosrc = tempfile.mkdtemp(".asn1c")
    inform("Created temporary directory (%s) for auto-generated files...", autosrc)
    absPathOfAADLfile = os.path.abspath(aadlFile)

...
@@ -113,6 +113,7 @@ class AsnNode:

        self._asnFilename = asnFilename
        self._lineno = -1
        self._isArtificial = False
+        self.hasAcnEncDec = True

    def Location(self) -> str:
        return "file %s, line %d" % (self._asnFilename, int(self._lineno))  # pragma: no cover

...
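The new `hasAcnEncDec` attribute introduced above defaults to `True` on `AsnNode` and is later overridden from the `HasAcnEncDecFunction` XML attribute. A hypothetical sketch of that string-to-bool derivation (the helper name is made up; the real code reads the attribute via `GetAttr`, which may return an empty value when absent):

```python
def parse_has_acn(attr_value):
    # Mirror the diff's idiom: fall back to "False" when the XML attribute
    # is missing/empty, then compare against the literal string "False".
    has_acn = attr_value or "False"
    return has_acn != "False"

assert parse_has_acn("True") is True
assert parse_has_acn("False") is False
assert parse_has_acn(None) is False  # missing attribute means no ACN enc/dec
```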
@@ -403,7 +403,8 @@ def ParseAsnFileList(listOfFilenames: List[str]) -> None:  # pylint: disable=inv

    if projectCache is not None:
        filehash = hashlib.md5()
        for each in sorted(listOfFilenames):
-            filehash.update(open(each).read().encode('utf-8'))
+            filehash.update(
+                open(each, "r", encoding="utf-8").read().encode('utf-8'))
        # also hash the file path: it is used in the AST in XML, so it is
        # not enough to hash the content of the ASN.1 files, as two sets
        # of files may have the same hash, that would lead to different XML
@@ -811,6 +812,8 @@ def VisitTypeAssignment(newModule: Module, xmlTypeAssignment: Element) -> Tuple[

        newNode._isArtificial = isArtificial == "True"
        name = GetAttr(xmlTypeAssignment, "Name")
        g_adaUses.setdefault(newModule._id, set()).add(name)
+        hasAcnEncDec = GetAttr(xmlType, "HasAcnEncDecFunction") or "False"
+        newNode.hasAcnEncDec = hasAcnEncDec != "False"
        return (name, newNode)

...
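The `ParseAsnFileList` change above pins the file encoding when hashing for the project cache: a bare `open()` uses the locale's default encoding, so the same ASN.1 files could decode (and therefore hash) differently between machines. A self-contained sketch of the stabilized hashing; the file name and content here are made up:

```python
import hashlib
import os
import tempfile

# Write a file containing a non-ASCII character, as an ASN.1 comment might.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "sample.asn")
with open(path, "w", encoding="utf-8") as f:
    f.write("-- degrees: \u00b0\nMyInt ::= INTEGER (0..255)\n")

filehash = hashlib.md5()
# Reading with an explicit encoding makes the digest locale-independent.
filehash.update(open(path, "r", encoding="utf-8").read().encode("utf-8"))
digest = filehash.hexdigest()
assert len(digest) == 32
```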
@@ -126,7 +126,7 @@ class Matcher:

        self._lastOne = 'Search'
        return self._search

-    def group(self, idx: int) -> str:
+    def group(self, idx: int) -> str:  # pylint: disable=inconsistent-return-statements
        if self._lastOne == 'Match':
            return self._match.group(idx)
        elif self._lastOne == 'Search':

@@ -136,7 +136,7 @@ class Matcher:

            "Matcher group called with index "
            "%d before match/search!\n" % idx)

-    def groups(self) -> Any:
+    def groups(self) -> Any:  # pylint: disable=inconsistent-return-statements
        if self._lastOne == 'Match':
            return self._match.groups()
        elif self._lastOne == 'Search':

...
@@ -60,7 +60,7 @@ confidence=

# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
-disable=coerce-method,nonzero-method,buffer-builtin,unichr-builtin,reload-builtin,using-cmp-argument,reduce-builtin,filter-builtin-not-iterating,zip-builtin-not-iterating,raising-string,long-builtin,backtick,long-suffix,delslice-method,suppressed-message,cmp-method,old-octal-literal,basestring-builtin,metaclass-assignment,print-statement,execfile-builtin,round-builtin,oct-method,standarderror-builtin,hex-method,import-star-module-level,indexing-exception,map-builtin-not-iterating,old-ne-operator,setslice-method,input-builtin,apply-builtin,range-builtin-not-iterating,xrange-builtin,parameter-unpacking,no-absolute-import,old-raise-syntax,dict-iter-method,unicode-builtin,unpacking-in-except,old-division,file-builtin,next-method-called,useless-suppression,raw_input-builtin,intern-builtin,getslice-method,dict-view-method,cmp-builtin,coerce-builtin,line-too-long,missing-docstring,protected-access,global-statement,too-many-arguments,too-many-branches,too-many-locals,bare-except,invalid-name,too-many-statements,broad-except,too-many-instance-attributes,too-many-public-methods,too-few-public-methods,similarities,no-else-return,fixme
+disable=coerce-method,nonzero-method,buffer-builtin,unichr-builtin,reload-builtin,using-cmp-argument,reduce-builtin,filter-builtin-not-iterating,zip-builtin-not-iterating,raising-string,long-builtin,backtick,long-suffix,delslice-method,suppressed-message,cmp-method,old-octal-literal,basestring-builtin,metaclass-assignment,print-statement,execfile-builtin,round-builtin,oct-method,standarderror-builtin,hex-method,import-star-module-level,indexing-exception,map-builtin-not-iterating,old-ne-operator,setslice-method,input-builtin,apply-builtin,range-builtin-not-iterating,xrange-builtin,parameter-unpacking,no-absolute-import,old-raise-syntax,dict-iter-method,unicode-builtin,unpacking-in-except,old-division,file-builtin,next-method-called,useless-suppression,raw_input-builtin,intern-builtin,getslice-method,dict-view-method,cmp-builtin,coerce-builtin,line-too-long,missing-docstring,protected-access,global-statement,too-many-arguments,too-many-branches,too-many-locals,bare-except,invalid-name,too-many-statements,broad-except,too-many-instance-attributes,too-many-public-methods,too-few-public-methods,similarities,no-else-return,fixme,relative-beyond-top-level,import-outside-toplevel

never-returning-functions=dmt.commonPy.utility.panic,sys.exit

...
coverage>=3.7.1
-flake8==2.6.0
+flake8==3.7.9
mypy==0.530
pyflakes>=1.2.3
pylint>=1.7.0

@@ -7,5 +7,6 @@ pytest>=2.6.3

lxml>=3.6.0
astroid>=1.4.6
pycodestyle>=2.0.0
-typing>=3.5.2.2
+typing==3.5.2.2
+typing-extensions
mypy-extensions>=0.3.0
@@ -29,7 +29,7 @@ setup(

        'coverage>=3.7.1',
        'pytest>=2.6.3',
        'pycodestyle>=2.0.0',
-        'typing>=3.5.2.2',
+        'typing==3.5.2.2',
        'mypy-extensions>=0.3.0',
    ],
    entry_points={

...
.PHONY: M2M M2C clean

-all: M2M M2C SMP2
+all: M2M M2C

M2M:
	$(MAKE) -f Makefile.M2M clean

...
@@ -31,7 +31,7 @@ class CompleteTestingOfSQLMapperWithSQLite(AllTests, unittest.TestCase):

class CompleteTestingOfSQLMapperWithPostgreSQL(AllTests, unittest.TestCase):
    #engine = create_engine('sqlite:///:memory:', echo=True)
    if os.getenv('CIRCLECI') is None:
-        dburi = 'postgresql+psycopg2://ubuntu:tastedb@localhost/circle_test'
+        dburi = 'postgresql+psycopg2://taste:tastedb@localhost/circle_test'
    else:
        dburi = 'postgresql+psycopg2://ubuntu:@localhost/circle_test'
    engine = create_engine(dburi, echo=False)

...
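The PostgreSQL test above selects its database URI from the environment: local runs now use the `taste`/`tastedb` credentials, while CircleCI keeps the `ubuntu` account. A minimal sketch of that selection (the function name is hypothetical; the URIs are copied from the diff):

```python
import os

def pick_dburi() -> str:
    # Local runs use the taste/tastedb credentials; CI uses the ubuntu account.
    if os.getenv('CIRCLECI') is None:
        return 'postgresql+psycopg2://taste:tastedb@localhost/circle_test'
    return 'postgresql+psycopg2://ubuntu:@localhost/circle_test'

uri = pick_dburi()
assert uri.startswith('postgresql+psycopg2://')
```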