Merged in POCONSOLE-33-Docker-Container (pull request #2)
POCONSOLE-33 docker container
3
.gitmodules
vendored
@@ -1,3 +0,0 @@
[submodule "tag"]
path = tag
url=ssh://git-codecommit.us-east-1.amazonaws.com/v1/repos/POCONSOLE-Tag
@@ -1,10 +0,0 @@
{
  "transport": "ftp",
  "uploadOnSave": false,
  "useAtomicWrites": false,
  "deleteLocal": false,
  "ignore": [
    ".remote-sync.json",
    ".git/**"
  ]
}
BIN
TagServer.mwb
14
daq/Dockerfile.rpi
Normal file
@@ -0,0 +1,14 @@
FROM patrickjmcd/rpi-python3:latest

# Copy source files
RUN mkdir /root/tag-logger
COPY taglogger.py /root/tag-logger/taglogger.py
COPY pycomm-master /tmp/pycomm
COPY pycomm_helper /tmp/pycomm_helper

# Install some python packages
RUN pip install requests
RUN cd /tmp/pycomm && python setup.py install && cd /
RUN cd /tmp/pycomm_helper && python setup.py install && cd /

CMD ["python", "/root/tag-logger/taglogger.py"]
14
daq/Dockerfile.ubuntu
Normal file
@@ -0,0 +1,14 @@
FROM python:latest

# Copy source files
RUN mkdir /root/tag-logger
COPY taglogger.py /root/tag-logger/taglogger.py
COPY pycomm-master /tmp/pycomm
COPY pycomm_helper /tmp/pycomm_helper

# Install some python packages
RUN pip install requests
RUN cd /tmp/pycomm && python setup.py install && cd /
RUN cd /tmp/pycomm_helper && python setup.py install && cd /

CMD ["python", "/root/tag-logger/taglogger.py"]
12
daq/pycomm-master/.travis.yml
Executable file
@@ -0,0 +1,12 @@
language: python

python:
- "2.6"
- "2.7"
- "3.2"
- "3.3"
- "3.4"

install: python setup.py install

script: nosetests
39
daq/pycomm-master/CHANGES
Executable file
@@ -0,0 +1,39 @@
CHANGES
=======

1.0.8
-----
Number 0001:
Handling of raw (hex) values added to the functions read_array and write_array; raw handling can be switched
on/off with an additional parameter.

Number 0002:
Bug fix when reading the tag_list from a PLC: if a tag is of datatype BOOL and is part of a bool
array within a SINT, the tag type value also contains the bit position.

Number 0003:
The code always logged to a file (pycomm.log) in the working path. The code was changed so that
the logging can be configured from the main application.


1.0.6
-----

- PyPi posting

1.0.0
-----

- Add support for SLC and PLC/05 PLCs

0.2.0
---

- Add CIP support class
- Add support for ControlLogix PLC

0.1
---

- Initial release.
22
daq/pycomm-master/LICENSE
Executable file
@@ -0,0 +1,22 @@
The MIT License (MIT)

Copyright (c) 2014 Agostino Ruscito

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
1
daq/pycomm-master/MANIFEST.in
Executable file
@@ -0,0 +1 @@
include README.rst
171
daq/pycomm-master/README.rst
Executable file
@@ -0,0 +1,171 @@
pycomm
======
pycomm is a package that includes a collection of modules used to communicate with PLCs.
At the moment the first module in the package is ab_comm.

Test
~~~~
The library is currently tested on Python 2.6 and 2.7.

.. image:: https://travis-ci.org/ruscito/pycomm.svg?branch=master
    :target: https://travis-ci.org/ruscito/pycomm

Setup
~~~~~
The package can be installed from

GitHub:
::

    git clone https://github.com/ruscito/pycomm.git
    cd pycomm
    sudo python setup.py install


PyPi:
::

    pip install pycomm

ab_comm
~~~~~~~
ab_comm is a module that contains a set of classes used to interface Rockwell PLCs using the Ethernet/IP protocol.
The "clx" class can be used to communicate with CompactLogix and ControlLogix PLCs.
The "slc" class can be used to communicate with MicroLogix or SLC PLCs.

I tried to follow CIP specifications volume 1 and 2, as well as `Rockwell Automation Publication 1756-PM020-EN-P - November 2012`_.

.. _Rockwell Automation Publication 1756-PM020-EN-P - November 2012: http://literature.rockwellautomation.com/idc/groups/literature/documents/pm/1756-pm020_-en-p.pdf

See the following snippet for communication with a ControlLogix PLC:

::

    from pycomm.ab_comm.clx import Driver as ClxDriver
    import logging


    if __name__ == '__main__':
        logging.basicConfig(
            filename="ClxDriver.log",
            format="%(levelname)-10s %(asctime)s %(message)s",
            level=logging.DEBUG
        )
        c = ClxDriver()

        if c.open('172.16.2.161'):

            print(c.read_tag(['ControlWord']))
            print(c.read_tag(['parts', 'ControlWord', 'Counts']))

            print(c.write_tag('Counts', -26, 'INT'))
            print(c.write_tag(('Counts', 26, 'INT')))
            print(c.write_tag([('Counts', 26, 'INT')]))
            print(c.write_tag([('Counts', -26, 'INT'), ('ControlWord', -30, 'DINT'), ('parts', 31, 'DINT')]))

            # To read an array
            r_array = c.read_array("TotalCount", 1750)
            for tag in r_array:
                print(tag)

            # reset the array to all 0
            w_array = []
            for i in xrange(1750):
                w_array.append(0)
            c.write_array("TotalCount", "SINT", w_array)

            c.close()

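Version 1.0.8 also added a "raw" switch to read_array and write_array (see the CHANGES file). The snippet below is an illustrative sketch, not part of the upstream examples; the IP address is a placeholder and the tag name reuses the array example above:

::

    from pycomm.ab_comm.clx import Driver as ClxDriver

    c = ClxDriver()
    if c.open('192.168.1.10'):
        # default: a list of (index, value) tuples
        values = c.read_array("TotalCount", 1750)

        # raw=True returns the packed (hex) bytes instead of decoded values
        raw_bytes = c.read_array("TotalCount", 1750, raw=True)

        c.close()
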
See the following snippet for communication with a MicroLogix PLC:

::

    from pycomm.ab_comm.slc import Driver as SlcDriver
    import logging


    if __name__ == '__main__':
        logging.basicConfig(
            filename="SlcDriver.log",
            format="%(levelname)-10s %(asctime)s %(message)s",
            level=logging.DEBUG
        )
        c = SlcDriver()
        if c.open('172.16.2.160'):

            print c.read_tag('S:1/5')
            print c.read_tag('S:60', 2)

            print c.write_tag('N7:0', [-30, 32767, -32767])
            print c.write_tag('N7:0', 21)
            print c.read_tag('N7:0', 10)

            print c.write_tag('F8:0', [3.1, 4.95, -32.89])
            print c.write_tag('F8:0', 21)
            print c.read_tag('F8:0', 3)

            print c.write_tag('B3:100', [23, -1, 4, 9])
            print c.write_tag('B3:100', 21)
            print c.read_tag('B3:100', 4)

            print c.write_tag('T4:3.PRE', 431)
            print c.read_tag('T4:3.PRE')
            print c.write_tag('C5:0.PRE', 501)
            print c.read_tag('C5:0.PRE')
            print c.write_tag('T4:3.ACC', 432)
            print c.read_tag('T4:3.ACC')
            print c.write_tag('C5:0.ACC', 502)
            print c.read_tag('C5:0.ACC')

            c.write_tag('T4:2.EN', 0)
            c.write_tag('T4:2.TT', 0)
            c.write_tag('T4:2.DN', 0)
            print c.read_tag('T4:2.EN', 1)
            print c.read_tag('T4:2.TT', 1)
            print c.read_tag('T4:2.DN',)

            c.write_tag('C5:0.CU', 1)
            c.write_tag('C5:0.CD', 0)
            c.write_tag('C5:0.DN', 1)
            c.write_tag('C5:0.OV', 0)
            c.write_tag('C5:0.UN', 1)
            c.write_tag('C5:0.UA', 0)
            print c.read_tag('C5:0.CU')
            print c.read_tag('C5:0.CD')
            print c.read_tag('C5:0.DN')
            print c.read_tag('C5:0.OV')
            print c.read_tag('C5:0.UN')
            print c.read_tag('C5:0.UA')

            c.write_tag('B3:100', 1)
            print c.read_tag('B3:100')

            c.write_tag('B3/3955', 1)
            print c.read_tag('B3/3955')

            c.write_tag('N7:0/2', 1)
            print c.read_tag('N7:0/2')

            print c.write_tag('O:0.0/4', 1)
            print c.read_tag('O:0.0/4')

            c.close()


The Future
~~~~~~~~~~
This package is under development.
The modules _ab_comm.clx_ and _ab_comm.slc_ are complete at the moment, but other drivers will be added in the future.

Thanks
~~~~~~
Thanks to patrickjmcd_ for the help with Direct Connections, and thanks in advance to anyone for feedback and suggestions.

.. _patrickjmcd: https://github.com/patrickjmcd

License
~~~~~~~
pycomm is distributed under the MIT License
42
daq/pycomm-master/examples/test_clx_comm.py
Executable file
@@ -0,0 +1,42 @@
from pycomm.ab_comm.clx import Driver as ClxDriver
import logging

from time import sleep


if __name__ == '__main__':

    logging.basicConfig(
        filename="ClxDriver.log",
        format="%(levelname)-10s %(asctime)s %(message)s",
        level=logging.DEBUG
    )
    c = ClxDriver()

    print c['port']
    print c.__version__


    if c.open('172.16.2.161'):
        while 1:
            try:
                print(c.read_tag(['ControlWord']))
                print(c.read_tag(['parts', 'ControlWord', 'Counts']))

                print(c.write_tag('Counts', -26, 'INT'))
                print(c.write_tag(('Counts', 26, 'INT')))
                print(c.write_tag([('Counts', 26, 'INT')]))
                print(c.write_tag([('Counts', -26, 'INT'), ('ControlWord', -30, 'DINT'), ('parts', 31, 'DINT')]))
                sleep(1)
            except Exception as e:
                err = c.get_status()
                c.close()
                print err
                pass

    # To read an array
    r_array = c.read_array("TotalCount", 1750)
    for tag in r_array:
        print(tag)

    c.close()
72
daq/pycomm-master/examples/test_slc_only.py
Executable file
@@ -0,0 +1,72 @@
__author__ = 'agostino'

from pycomm.ab_comm.slc import Driver as SlcDriver


if __name__ == '__main__':
    c = SlcDriver(True, 'delete_slc.log')
    if c.open('172.16.2.160'):

        while 1:
            try:
                print c.read_tag('S:1/5')
                print c.read_tag('S:60', 2)

                print c.write_tag('N7:0', [-30, 32767, -32767])
                print c.write_tag('N7:0', 21)
                print c.read_tag('N7:0', 10)

                print c.write_tag('F8:0', [3.1, 4.95, -32.89])
                print c.write_tag('F8:0', 21)
                print c.read_tag('F8:0', 3)

                print c.write_tag('B3:100', [23, -1, 4, 9])
                print c.write_tag('B3:100', 21)
                print c.read_tag('B3:100', 4)

                print c.write_tag('T4:3.PRE', 431)
                print c.read_tag('T4:3.PRE')
                print c.write_tag('C5:0.PRE', 501)
                print c.read_tag('C5:0.PRE')
                print c.write_tag('T4:3.ACC', 432)
                print c.read_tag('T4:3.ACC')
                print c.write_tag('C5:0.ACC', 502)
                print c.read_tag('C5:0.ACC')

                c.write_tag('T4:2.EN', 0)
                c.write_tag('T4:2.TT', 0)
                c.write_tag('T4:2.DN', 0)
                print c.read_tag('T4:2.EN', 1)
                print c.read_tag('T4:2.TT', 1)
                print c.read_tag('T4:2.DN',)

                c.write_tag('C5:0.CU', 1)
                c.write_tag('C5:0.CD', 0)
                c.write_tag('C5:0.DN', 1)
                c.write_tag('C5:0.OV', 0)
                c.write_tag('C5:0.UN', 1)
                c.write_tag('C5:0.UA', 0)
                print c.read_tag('C5:0.CU')
                print c.read_tag('C5:0.CD')
                print c.read_tag('C5:0.DN')
                print c.read_tag('C5:0.OV')
                print c.read_tag('C5:0.UN')
                print c.read_tag('C5:0.UA')

                c.write_tag('B3:100', 1)
                print c.read_tag('B3:100')

                c.write_tag('B3/3955', 1)
                print c.read_tag('B3/3955')

                c.write_tag('N7:0/2', 1)
                print c.read_tag('N7:0/2')

                print c.write_tag('O:0.0/4', 1)
                print c.read_tag('O:0.0/4')
            except Exception as e:
                err = c.get_status()
                #c.close()
                print err
                pass
        c.close()
1
daq/pycomm-master/pycomm/__init__.py
Executable file
@@ -0,0 +1 @@
__author__ = 'agostino'
2
daq/pycomm-master/pycomm/ab_comm/__init__.py
Executable file
@@ -0,0 +1,2 @@
__author__ = 'agostino'
import logging
873
daq/pycomm-master/pycomm/ab_comm/clx.py
Executable file
@@ -0,0 +1,873 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# clx.py - Ethernet/IP Client for Rockwell PLCs
|
||||
#
|
||||
#
|
||||
# Copyright (c) 2014 Agostino Ruscito <ruscito@gmail.com>
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
# of this software and associated documentation files (the "Software"), to deal
|
||||
# in the Software without restriction, including without limitation the rights
|
||||
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
# copies of the Software, and to permit persons to whom the Software is
|
||||
# furnished to do so, subject to the following conditions:
|
||||
#
|
||||
# The above copyright notice and this permission notice shall be included in all
|
||||
# copies or substantial portions of the Software.
|
||||
#
|
||||
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
# SOFTWARE.
|
||||
#
|
||||
from pycomm.cip.cip_base import *
|
||||
import logging
|
||||
try: # Python 2.7+
|
||||
from logging import NullHandler
|
||||
except ImportError:
|
||||
class NullHandler(logging.Handler):
|
||||
def emit(self, record):
|
||||
pass
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
logger.addHandler(NullHandler())
|
||||
|
||||
|
||||
class Driver(Base):
|
||||
"""
|
||||
This Ethernet/IP client is based on Rockwell specification. Please refer to the link below for details.
|
||||
|
||||
http://literature.rockwellautomation.com/idc/groups/literature/documents/pm/1756-pm020_-en-p.pdf
|
||||
|
||||
The following services have been implemented:
|
||||
- Read Tag Service (0x4c)
|
||||
- Read Tag Fragment Service (0x52)
|
||||
- Write Tag Service (0x4d)
|
||||
- Write Tag Fragment Service (0x53)
|
||||
- Multiple Service Packet (0x0a)
|
||||
|
||||
The client has been successfully tested with the following PLCs:
|
||||
- CompactLogix 5330ERM
|
||||
- CompactLogix 5370
|
||||
- ControlLogix 5572 and 1756-EN2T Module
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
super(Driver, self).__init__()
|
||||
|
||||
self._buffer = {}
|
||||
self._get_template_in_progress = False
|
||||
self.__version__ = '0.2'
|
||||
|
||||
def get_last_tag_read(self):
|
||||
""" Return the last tag read by a multi request read
|
||||
|
||||
:return: A tuple (tag name, value, type)
|
||||
"""
|
||||
return self._last_tag_read
|
||||
|
||||
def get_last_tag_write(self):
|
||||
""" Return the last tag write by a multi request write
|
||||
|
||||
:return: A tuple (tag name, 'GOOD') if the write was successful otherwise (tag name, 'BAD')
|
||||
"""
|
||||
return self._last_tag_write
|
||||
|
||||
def _parse_instance_attribute_list(self, start_tag_ptr, status):
|
||||
""" extract the tags list from the message received
|
||||
|
||||
:param start_tag_ptr: The point in the message string where the tag list begins
:param status: The status of the message received
|
||||
"""
|
||||
tags_returned = self._reply[start_tag_ptr:]
|
||||
tags_returned_length = len(tags_returned)
|
||||
idx = 0
|
||||
instance = 0
|
||||
count = 0
|
||||
try:
|
||||
while idx < tags_returned_length:
|
||||
instance = unpack_dint(tags_returned[idx:idx+4])
|
||||
idx += 4
|
||||
tag_length = unpack_uint(tags_returned[idx:idx+2])
|
||||
idx += 2
|
||||
tag_name = tags_returned[idx:idx+tag_length]
|
||||
idx += tag_length
|
||||
symbol_type = unpack_uint(tags_returned[idx:idx+2])
|
||||
idx += 2
|
||||
count += 1
|
||||
self._tag_list.append({'instance_id': instance,
|
||||
'tag_name': tag_name,
|
||||
'symbol_type': symbol_type})
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
if status == SUCCESS:
|
||||
self._last_instance = -1
|
||||
elif status == 0x06:
|
||||
self._last_instance = instance + 1
|
||||
else:
|
||||
self._status = (1, 'unknown status during _parse_tag_list')
|
||||
self._last_instance = -1
|
||||
|
||||
def _parse_structure_makeup_attributes(self, start_tag_ptr, status):
|
||||
""" extract the tags list from the message received
|
||||
|
||||
:param start_tag_ptr: The point in the message string where the tag list begin
|
||||
:param status: The status of the message receives
|
||||
"""
|
||||
self._buffer = {}
|
||||
|
||||
if status != SUCCESS:
|
||||
self._buffer['Error'] = status
|
||||
return
|
||||
|
||||
attribute = self._reply[start_tag_ptr:]
|
||||
idx = 4
|
||||
try:
|
||||
if unpack_uint(attribute[idx:idx + 2]) == SUCCESS:
|
||||
idx += 2
|
||||
self._buffer['object_definition_size'] = unpack_dint(attribute[idx:idx + 4])
|
||||
else:
|
||||
self._buffer['Error'] = 'object_definition Error'
|
||||
return
|
||||
|
||||
idx += 6
|
||||
if unpack_uint(attribute[idx:idx + 2]) == SUCCESS:
|
||||
idx += 2
|
||||
self._buffer['structure_size'] = unpack_dint(attribute[idx:idx + 4])
|
||||
else:
|
||||
self._buffer['Error'] = 'structure Error'
|
||||
return
|
||||
|
||||
idx += 6
|
||||
if unpack_uint(attribute[idx:idx + 2]) == SUCCESS:
|
||||
idx += 2
|
||||
self._buffer['member_count'] = unpack_uint(attribute[idx:idx + 2])
|
||||
else:
|
||||
self._buffer['Error'] = 'member_count Error'
|
||||
return
|
||||
|
||||
idx += 4
|
||||
if unpack_uint(attribute[idx:idx + 2]) == SUCCESS:
|
||||
idx += 2
|
||||
self._buffer['structure_handle'] = unpack_uint(attribute[idx:idx + 2])
|
||||
else:
|
||||
self._buffer['Error'] = 'structure_handle Error'
|
||||
return
|
||||
|
||||
return self._buffer
|
||||
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _parse_template(self, start_tag_ptr, status):
|
||||
""" extract the tags list from the message received
|
||||
|
||||
:param start_tag_ptr: The point in the message string where the tag list begin
|
||||
:param status: The status of the message receives
|
||||
"""
|
||||
tags_returned = self._reply[start_tag_ptr:]
|
||||
bytes_received = len(tags_returned)
|
||||
|
||||
self._buffer += tags_returned
|
||||
|
||||
if status == SUCCESS:
|
||||
self._get_template_in_progress = False
|
||||
|
||||
elif status == 0x06:
|
||||
self._byte_offset += bytes_received
|
||||
else:
|
||||
self._status = (1, 'unknown status {0} during _parse_template'.format(status))
|
||||
logger.warning(self._status)
|
||||
self._last_instance = -1
|
||||
|
||||
def _parse_fragment(self, start_ptr, status):
|
||||
""" parse the fragment returned by a fragment service.
|
||||
|
||||
:param start_ptr: Where the fragment start within the replay
|
||||
:param status: status field used to decide if keep parsing or stop
|
||||
"""
|
||||
try:
|
||||
data_type = unpack_uint(self._reply[start_ptr:start_ptr+2])
|
||||
fragment_returned = self._reply[start_ptr+2:]
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
fragment_returned_length = len(fragment_returned)
|
||||
idx = 0
|
||||
|
||||
while idx < fragment_returned_length:
|
||||
try:
|
||||
typ = I_DATA_TYPE[data_type]
|
||||
if self._output_raw:
|
||||
value = fragment_returned[idx:idx+DATA_FUNCTION_SIZE[typ]]
|
||||
else:
|
||||
value = UNPACK_DATA_FUNCTION[typ](fragment_returned[idx:idx+DATA_FUNCTION_SIZE[typ]])
|
||||
idx += DATA_FUNCTION_SIZE[typ]
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
if self._output_raw:
|
||||
self._tag_list += value
|
||||
else:
|
||||
self._tag_list.append((self._last_position, value))
|
||||
self._last_position += 1
|
||||
|
||||
if status == SUCCESS:
|
||||
self._byte_offset = -1
|
||||
elif status == 0x06:
|
||||
self._byte_offset += fragment_returned_length
|
||||
else:
|
||||
self._status = (2, 'unknown status during _parse_fragment')
|
||||
self._byte_offset = -1
|
||||
|
||||
def _parse_multiple_request_read(self, tags):
|
||||
""" parse the message received from a multi request read:
|
||||
|
||||
For each tag parsed, the information extracted includes the tag name, the value read and the data type.
|
||||
That information is appended to the tag list as a tuple
|
||||
|
||||
:return: the tag list
|
||||
"""
|
||||
offset = 50
|
||||
position = 50
|
||||
try:
|
||||
number_of_service_replies = unpack_uint(self._reply[offset:offset+2])
|
||||
tag_list = []
|
||||
for index in range(number_of_service_replies):
|
||||
position += 2
|
||||
start = offset + unpack_uint(self._reply[position:position+2])
|
||||
general_status = unpack_usint(self._reply[start+2:start+3])
|
||||
|
||||
if general_status == 0:
|
||||
data_type = unpack_uint(self._reply[start+4:start+6])
|
||||
value_begin = start + 6
|
||||
value_end = value_begin + DATA_FUNCTION_SIZE[I_DATA_TYPE[data_type]]
|
||||
value = self._reply[value_begin:value_end]
|
||||
self._last_tag_read = (tags[index], UNPACK_DATA_FUNCTION[I_DATA_TYPE[data_type]](value),
|
||||
I_DATA_TYPE[data_type])
|
||||
else:
|
||||
self._last_tag_read = (tags[index], None, None)
|
||||
|
||||
tag_list.append(self._last_tag_read)
|
||||
|
||||
return tag_list
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _parse_multiple_request_write(self, tags):
|
||||
""" parse the message received from a multi request writ:
|
||||
|
||||
For each tag parsed, the information extracted includes the tag name and the status of the writing.
|
||||
That information is appended to the tag list as a tuple
|
||||
|
||||
:return: the tag list
|
||||
"""
|
||||
offset = 50
|
||||
position = 50
|
||||
try:
|
||||
number_of_service_replies = unpack_uint(self._reply[offset:offset+2])
|
||||
tag_list = []
|
||||
for index in range(number_of_service_replies):
|
||||
position += 2
|
||||
start = offset + unpack_uint(self._reply[position:position+2])
|
||||
general_status = unpack_usint(self._reply[start+2:start+3])
|
||||
|
||||
if general_status == 0:
|
||||
self._last_tag_write = (tags[index] + ('GOOD',))
|
||||
else:
|
||||
self._last_tag_write = (tags[index] + ('BAD',))
|
||||
|
||||
tag_list.append(self._last_tag_write)
|
||||
return tag_list
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _check_reply(self):
|
||||
""" check the replayed message for error
|
||||
|
||||
"""
|
||||
self._more_packets_available = False
|
||||
try:
|
||||
if self._reply is None:
|
||||
self._status = (3, '%s without reply' % REPLAY_INFO[unpack_dint(self._message[:2])])
|
||||
return False
|
||||
# Get the type of command
|
||||
typ = unpack_uint(self._reply[:2])
|
||||
|
||||
# Encapsulation status check
|
||||
if unpack_dint(self._reply[8:12]) != SUCCESS:
|
||||
self._status = (3, "{0} reply status:{1}".format(REPLAY_INFO[typ],
|
||||
SERVICE_STATUS[unpack_dint(self._reply[8:12])]))
|
||||
return False
|
||||
|
||||
# Command Specific Status check
|
||||
if typ == unpack_uint(ENCAPSULATION_COMMAND["send_rr_data"]):
|
||||
status = unpack_usint(self._reply[42:43])
|
||||
if status != SUCCESS:
|
||||
self._status = (3, "send_rr_data reply:{0} - Extend status:{1}".format(
|
||||
SERVICE_STATUS[status], get_extended_status(self._reply, 42)))
|
||||
return False
|
||||
else:
|
||||
return True
|
||||
elif typ == unpack_uint(ENCAPSULATION_COMMAND["send_unit_data"]):
|
||||
status = unpack_usint(self._reply[48:49])
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Read Tag Fragmented"]:
|
||||
self._parse_fragment(50, status)
|
||||
return True
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Get Instance Attributes List"]:
|
||||
self._parse_instance_attribute_list(50, status)
|
||||
return True
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Get Attributes"]:
|
||||
self._parse_structure_makeup_attributes(50, status)
|
||||
return True
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Read Template"] and \
|
||||
self._get_template_in_progress:
|
||||
self._parse_template(50, status)
|
||||
return True
|
||||
if status == 0x06:
|
||||
self._status = (3, "Insufficient Packet Space")
|
||||
self._more_packets_available = True
|
||||
elif status != SUCCESS:
|
||||
self._status = (3, "send_unit_data reply:{0} - Extend status:{1}".format(
|
||||
SERVICE_STATUS[status], get_extended_status(self._reply, 48)))
|
||||
return False
|
||||
else:
|
||||
return True
|
||||
|
||||
return True
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def read_tag(self, tag):
|
||||
""" read tag from a connected plc
|
||||
|
||||
Possible combination can be passed to this method:
|
||||
- ('Counts') a single tag name
|
||||
- (['ControlWord']) a list with one tag or many
|
||||
- (['parts', 'ControlWord', 'Counts'])
|
||||
|
||||
At the moment there is not a strong validation for the argument passed. The user should verify
|
||||
the correctness of the format passed.
|
||||
|
||||
:return: None is returned in case of error otherwise the tag list is returned
|
||||
"""
|
||||
multi_requests = False
|
||||
if isinstance(tag, list):
|
||||
multi_requests = True
|
||||
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (6, "Target did not connected. read_tag will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. read_tag will not be executed.")
|
||||
|
||||
if multi_requests:
|
||||
rp_list = []
|
||||
for t in tag:
|
||||
rp = create_tag_rp(t, multi_requests=True)
|
||||
if rp is None:
|
||||
self._status = (6, "Cannot create tag {0} request packet. read_tag will not be executed.".format(tag))
|
||||
raise DataError("Cannot create tag {0} request packet. read_tag will not be executed.".format(tag))
|
||||
else:
|
||||
rp_list.append(chr(TAG_SERVICES_REQUEST['Read Tag']) + rp + pack_uint(1))
|
||||
message_request = build_multiple_service(rp_list, Base._get_sequence())
|
||||
|
||||
else:
|
||||
rp = create_tag_rp(tag)
|
||||
if rp is None:
|
||||
self._status = (6, "Cannot create tag {0} request packet. read_tag will not be executed.".format(tag))
|
||||
return None
|
||||
else:
|
||||
# Creating the Message Request Packet
|
||||
message_request = [
|
||||
pack_uint(Base._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST['Read Tag']), # the Request Service
|
||||
chr(len(rp) / 2), # the Request Path Size length in word
|
||||
rp, # the request path
|
||||
pack_uint(1)
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,
|
||||
)) is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
if multi_requests:
|
||||
return self._parse_multiple_request_read(tag)
|
||||
else:
|
||||
# Get the data type
|
||||
data_type = unpack_uint(self._reply[50:52])
|
||||
try:
|
||||
return UNPACK_DATA_FUNCTION[I_DATA_TYPE[data_type]](self._reply[52:]), I_DATA_TYPE[data_type]
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def read_array(self, tag, counts, raw=False):
|
||||
""" read array of atomic data type from a connected plc
|
||||
|
||||
At the moment there is not a strong validation for the argument passed. The user should verify
|
||||
the correctness of the format passed.
|
||||
|
||||
:param tag: the name of the tag to read
|
||||
:param counts: the number of element to read
|
||||
:param raw: the value should output as raw-value (hex)
|
||||
:return: None is returned in case of error otherwise the tag list is returned
|
||||
"""
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (7, "Target did not connected. read_tag will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. read_tag will not be executed.")
|
||||
|
||||
self._byte_offset = 0
|
||||
self._last_position = 0
|
||||
self._output_raw = raw
|
||||
|
||||
if self._output_raw:
|
||||
self._tag_list = ''
|
||||
else:
|
||||
self._tag_list = []
|
||||
while self._byte_offset != -1:
|
||||
rp = create_tag_rp(tag)
|
||||
if rp is None:
|
||||
self._status = (7, "Cannot create tag {0} request packet. read_tag will not be executed.".format(tag))
|
||||
return None
|
||||
else:
|
||||
# Creating the Message Request Packet
|
||||
message_request = [
|
||||
pack_uint(Base._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST["Read Tag Fragmented"]), # the Request Service
|
||||
chr(len(rp) / 2), # the Request Path Size length in word
|
||||
rp, # the request path
|
||||
pack_uint(counts),
|
||||
pack_dint(self._byte_offset)
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,
|
||||
)) is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
return self._tag_list
|
||||
|
||||
def write_tag(self, tag, value=None, typ=None):
|
||||
""" write tag/tags from a connected plc
|
||||
|
||||
Possible combination can be passed to this method:
|
||||
- ('tag name', Value, data type) as single parameters or inside a tuple
|
||||
- ([('tag name', Value, data type), ('tag name2', Value, data type)]) as array of tuples
|
||||
|
||||
At the moment there is not a strong validation for the argument passed. The user should verify
|
||||
the correctness of the format passed.
|
||||
|
||||
The type accepted are:
|
||||
- BOOL
|
||||
- SINT
|
||||
- INT
|
||||
- DINT
|
||||
- REAL
|
||||
- LINT
|
||||
- BYTE
|
||||
- WORD
|
||||
- DWORD
|
||||
- LWORD
|
||||
|
||||
:param tag: tag name, or an array of tuple containing (tag name, value, data type)
|
||||
:param value: the value to write or none if tag is an array of tuple or a tuple
|
||||
:param typ: the type of the tag to write or none if tag is an array of tuple or a tuple
|
||||
:return: None is returned in case of error otherwise the tag list is returned
|
||||
"""
|
||||
multi_requests = False
|
||||
if isinstance(tag, list):
|
||||
multi_requests = True
|
||||
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (8, "Target did not connected. write_tag will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. write_tag will not be executed.")
|
||||
|
||||
if multi_requests:
|
||||
rp_list = []
|
||||
tag_to_remove = []
|
||||
idx = 0
|
||||
for name, value, typ in tag:
|
||||
# Create the request path to wrap the tag name
|
||||
rp = create_tag_rp(name, multi_requests=True)
|
||||
if rp is None:
|
||||
self._status = (8, "Cannot create tag{0} req. packet. write_tag will not be executed".format(tag))
|
||||
return None
|
||||
else:
|
||||
try: # Trying to add the rp to the request path list
|
||||
val = PACK_DATA_FUNCTION[typ](value)
|
||||
rp_list.append(
|
||||
chr(TAG_SERVICES_REQUEST['Write Tag'])
|
||||
+ rp
|
||||
+ pack_uint(S_DATA_TYPE[typ])
|
||||
+ pack_uint(1)
|
||||
+ val
|
||||
)
|
||||
idx += 1
|
||||
except (LookupError, struct.error) as e:
|
||||
self._status = (8, "Tag:{0} type:{1} removed from write list. Error:{2}.".format(name, typ, e))
|
||||
|
||||
# The tag in idx position need to be removed from the rp list because has some kind of error
|
||||
tag_to_remove.append(idx)
|
||||
|
||||
# Remove the tags that have not been inserted in the request path list
|
||||
for position in tag_to_remove:
|
||||
del tag[position]
|
||||
# Create the message request
|
||||
message_request = build_multiple_service(rp_list, Base._get_sequence())
|
||||
|
||||
else:
|
||||
if isinstance(tag, tuple):
|
||||
name, value, typ = tag
|
||||
else:
|
||||
name = tag
|
||||
|
||||
rp = create_tag_rp(name)
|
||||
if rp is None:
|
||||
self._status = (8, "Cannot create tag {0} request packet. write_tag will not be executed.".format(tag))
|
||||
logger.warning(self._status)
|
||||
return None
|
||||
else:
|
||||
# Creating the Message Request Packet
|
||||
message_request = [
|
||||
pack_uint(Base._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST["Write Tag"]), # the Request Service
|
||||
chr(len(rp) / 2), # the Request Path Size length in word
|
||||
rp, # the request path
|
||||
pack_uint(S_DATA_TYPE[typ]), # data type
|
||||
pack_uint(1), # Add the number of tag to write
|
||||
PACK_DATA_FUNCTION[typ](value)
|
||||
]
|
||||
|
||||
ret_val = self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,
|
||||
)
|
||||
)
|
||||
|
||||
if multi_requests:
|
||||
return self._parse_multiple_request_write(tag)
|
||||
else:
|
||||
if ret_val is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
return ret_val
|
||||
|
||||
def write_array(self, tag, data_type, values, raw=False):
|
||||
""" write array of atomic data type from a connected plc
|
||||
|
||||
At the moment there is not a strong validation for the argument passed. The user should verify
|
||||
the correctness of the format passed.
|
||||
|
||||
:param tag: the name of the tag to read
|
||||
:param data_type: the type of tag to write
|
||||
:param values: the array of values to write, if raw: the frame with bytes
|
||||
:param raw: indicates that the values are given as raw values (hex)
|
||||
"""
|
||||
if not isinstance(values, list):
|
||||
self._status = (9, "A list of tags must be passed to write_array.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("A list of tags must be passed to write_array.")
|
||||
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (9, "Target did not connected. write_array will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. write_array will not be executed.")
|
||||
|
||||
array_of_values = ""
|
||||
byte_size = 0
|
||||
byte_offset = 0
|
||||
|
||||
for i, value in enumerate(values):
|
||||
if raw:
|
||||
array_of_values += value
|
||||
else:
|
||||
array_of_values += PACK_DATA_FUNCTION[data_type](value)
|
||||
byte_size += DATA_FUNCTION_SIZE[data_type]
|
||||
|
||||
if byte_size >= 450 or i == len(values)-1:
|
||||
# create the message and send the fragment
|
||||
rp = create_tag_rp(tag)
|
||||
if rp is None:
|
||||
self._status = (9, "Cannot create tag {0} request packet. \
|
||||
write_array will not be executed.".format(tag))
|
||||
return None
|
||||
else:
|
||||
# Creating the Message Request Packet
|
||||
message_request = [
|
||||
pack_uint(Base._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST["Write Tag Fragmented"]), # the Request Service
|
||||
chr(len(rp) / 2), # the Request Path Size length in word
|
||||
rp, # the request path
|
||||
pack_uint(S_DATA_TYPE[data_type]), # Data type to write
|
||||
pack_uint(len(values)), # Number of elements to write
|
||||
pack_dint(byte_offset),
|
||||
array_of_values # Fragment of elements to write
|
||||
]
|
||||
byte_offset += byte_size
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,
|
||||
)) is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
array_of_values = ""
|
||||
byte_size = 0
|
||||
|
||||
def _get_instance_attribute_list_service(self):
|
||||
""" Step 1: Finding user-created controller scope tags in a Logix5000 controller
|
||||
|
||||
This service returns instance IDs for each created instance of the symbol class, along with a list
|
||||
of the attribute data associated with the requested attribute
|
||||
"""
|
||||
try:
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (10, "Target did not connected. get_tag_list will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. get_tag_list will not be executed.")
|
||||
|
||||
self._last_instance = 0
|
||||
|
||||
self._get_template_in_progress = True
|
||||
while self._last_instance != -1:
|
||||
|
||||
# Creating the Message Request Packet
|
||||
|
||||
message_request = [
|
||||
pack_uint(Base._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST['Get Instance Attributes List']), # STEP 1
|
||||
# the Request Path Size length in word
|
||||
chr(3),
|
||||
# Request Path ( 20 6B 25 00 Instance )
|
||||
CLASS_ID["8-bit"], # Class id = 20 from spec 0x20
|
||||
CLASS_CODE["Symbol Object"], # Logical segment: Symbolic Object 0x6B
|
||||
INSTANCE_ID["16-bit"], # Instance Segment: 16 Bit instance 0x25
|
||||
'\x00',
|
||||
pack_uint(self._last_instance), # The instance
|
||||
# Request Data
|
||||
pack_uint(2), # Number of attributes to retrieve
|
||||
pack_uint(1), # Attribute 1: Symbol name
|
||||
pack_uint(2) # Attribute 2: Symbol type
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,
|
||||
)) is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
self._get_template_in_progress = False
|
||||
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _get_structure_makeup(self, instance_id):
|
||||
"""
|
||||
get the structure makeup for a specific structure
|
||||
"""
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (10, "Target did not connected. get_tag_list will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. get_tag_list will not be executed.")
|
||||
|
||||
message_request = [
|
||||
pack_uint(self._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST['Get Attributes']),
|
||||
chr(3), # Request Path ( 20 6B 25 00 Instance )
|
||||
CLASS_ID["8-bit"], # Class id = 20 from spec 0x20
|
||||
CLASS_CODE["Template Object"], # Logical segment: Template Object 0x6C
|
||||
INSTANCE_ID["16-bit"], # Instance Segment: 16 Bit instance 0x25
|
||||
'\x00',
|
||||
pack_uint(instance_id),
|
||||
pack_uint(4), # Number of attributes
|
||||
pack_uint(4), # Template Object Definition Size UDINT
|
||||
pack_uint(5), # Template Structure Size UDINT
|
||||
pack_uint(2), # Template Member Count UINT
|
||||
pack_uint(1) # Structure Handle We can use this to read and write UINT
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(DATA_ITEM['Connected'],
|
||||
''.join(message_request), ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,)) is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
return self._buffer
|
||||
|
||||
def _read_template(self, instance_id, object_definition_size):
|
||||
""" get a list of the tags in the plc
|
||||
|
||||
"""
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (10, "Target did not connected. get_tag_list will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. get_tag_list will not be executed.")
|
||||
|
||||
self._byte_offset = 0
|
||||
self._buffer = ""
|
||||
self._get_template_in_progress = True
|
||||
|
||||
try:
|
||||
while self._get_template_in_progress:
|
||||
|
||||
# Creating the Message Request Packet
|
||||
|
||||
message_request = [
|
||||
pack_uint(self._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST['Read Template']),
|
||||
chr(3), # Request Path ( 20 6B 25 00 Instance )
|
||||
CLASS_ID["8-bit"], # Class id = 20 from spec 0x20
|
||||
CLASS_CODE["Template Object"], # Logical segment: Template Object 0x6C
|
||||
INSTANCE_ID["16-bit"], # Instance Segment: 16 Bit instance 0x25
|
||||
'\x00',
|
||||
pack_uint(instance_id),
|
||||
pack_dint(self._byte_offset), # Offset
|
||||
pack_uint(((object_definition_size * 4)-23) - self._byte_offset)
|
||||
]
|
||||
|
||||
if not self.send_unit_data(
|
||||
build_common_packet_format(DATA_ITEM['Connected'], ''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'], addr_data=self._target_cid,)):
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
self._get_template_in_progress = False
|
||||
return self._buffer
|
||||
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _isolating_user_tag(self):
|
||||
try:
|
||||
lst = self._tag_list
|
||||
self._tag_list = []
|
||||
for tag in lst:
|
||||
if tag['tag_name'].find(':') != -1 or tag['tag_name'].find('__') != -1:
|
||||
continue
|
||||
if tag['symbol_type'] & 0b0001000000000000:
|
||||
continue
|
||||
dimension = (tag['symbol_type'] & 0b0110000000000000) >> 13
|
||||
|
||||
if tag['symbol_type'] & 0b1000000000000000 :
|
||||
template_instance_id = tag['symbol_type'] & 0b0000111111111111
|
||||
tag_type = 'struct'
|
||||
data_type = 'user-created'
|
||||
self._tag_list.append({'instance_id': tag['instance_id'],
|
||||
'template_instance_id': template_instance_id,
|
||||
'tag_name': tag['tag_name'],
|
||||
'dim': dimension,
|
||||
'tag_type': tag_type,
|
||||
'data_type': data_type,
|
||||
'template': {},
|
||||
'udt': {}})
|
||||
else:
|
||||
tag_type = 'atomic'
|
||||
datatype = tag['symbol_type'] & 0b0000000011111111
|
||||
data_type = I_DATA_TYPE[datatype]
|
||||
if datatype == 0xc1:
|
||||
bit_position = (tag['symbol_type'] & 0b0000011100000000) >> 8
|
||||
self._tag_list.append({'instance_id': tag['instance_id'],
|
||||
'tag_name': tag['tag_name'],
|
||||
'dim': dimension,
|
||||
'tag_type': tag_type,
|
||||
'data_type': data_type,
|
||||
'bit_position' : bit_position})
|
||||
else:
|
||||
self._tag_list.append({'instance_id': tag['instance_id'],
|
||||
'tag_name': tag['tag_name'],
|
||||
'dim': dimension,
|
||||
'tag_type': tag_type,
|
||||
'data_type': data_type})
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _parse_udt_raw(self, tag):
|
||||
try:
|
||||
buff = self._read_template(tag['template_instance_id'], tag['template']['object_definition_size'])
|
||||
member_count = tag['template']['member_count']
|
||||
names = buff.split('\00')
|
||||
lst = []
|
||||
|
||||
tag['udt']['name'] = 'Not an user defined structure'
|
||||
for name in names:
|
||||
if len(name) > 1:
|
||||
|
||||
if name.find(';') != -1:
|
||||
tag['udt']['name'] = name[:name.find(';')]
|
||||
elif name.find('ZZZZZZZZZZ') != -1:
|
||||
continue
|
||||
elif name.isalpha():
|
||||
lst.append(name)
|
||||
else:
|
||||
continue
|
||||
tag['udt']['internal_tags'] = lst
|
||||
|
||||
type_list = []
|
||||
|
||||
for i in xrange(member_count):
|
||||
# skip member 1
|
||||
|
||||
if i != 0:
|
||||
array_size = unpack_uint(buff[:2])
|
||||
try:
|
||||
data_type = I_DATA_TYPE[unpack_uint(buff[2:4])]
|
||||
except Exception:
|
||||
data_type = "None"
|
||||
|
||||
offset = unpack_dint(buff[4:8])
|
||||
type_list.append((array_size, data_type, offset))
|
||||
|
||||
buff = buff[8:]
|
||||
|
||||
tag['udt']['data_type'] = type_list
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def get_tag_list(self):
|
||||
self._tag_list = []
|
||||
# Step 1
|
||||
self._get_instance_attribute_list_service()
|
||||
|
||||
# Step 2
|
||||
self._isolating_user_tag()
|
||||
|
||||
# Step 3
|
||||
for tag in self._tag_list:
|
||||
if tag['tag_type'] == 'struct':
|
||||
tag['template'] = self._get_structure_makeup(tag['template_instance_id'])
|
||||
|
||||
for idx, tag in enumerate(self._tag_list):
|
||||
# print (tag)
|
||||
if tag['tag_type'] == 'struct':
|
||||
self._parse_udt_raw(tag)
|
||||
|
||||
# Step 4
|
||||
|
||||
return self._tag_list
|
||||
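# ---------------------------------------------------------------------------
# Illustrative usage (not part of the original file): the numbered steps above
# are driven by get_tag_list(), which returns one dict per user-created tag
# with the keys filled in by _isolating_user_tag() ('tag_name', 'tag_type',
# 'data_type', 'dim', ...).  A caller might enumerate the controller tags
# like this (the IP address is a placeholder):
#
#     from pycomm.ab_comm.clx import Driver as ClxDriver
#
#     c = ClxDriver()
#     if c.open('192.168.1.10'):
#         for t in c.get_tag_list():
#             print(t['tag_name'], t['tag_type'], t['data_type'])
#         c.close()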
574
daq/pycomm-master/pycomm/ab_comm/slc.py
Executable file
@@ -0,0 +1,574 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# slc.py - Ethernet/IP Client for Rockwell PLCs
|
||||
#
|
||||
#
|
||||
# Copyright (c) 2014 Agostino Ruscito <ruscito@gmail.com>
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
# of this software and associated documentation files (the "Software"), to deal
|
||||
# in the Software without restriction, including without limitation the rights
|
||||
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
# copies of the Software, and to permit persons to whom the Software is
|
||||
# furnished to do so, subject to the following conditions:
|
||||
#
|
||||
# The above copyright notice and this permission notice shall be included in all
|
||||
# copies or substantial portions of the Software.
|
||||
#
|
||||
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
# SOFTWARE.
|
||||
#
|
||||
from pycomm.cip.cip_base import *
|
||||
import re
|
||||
import math
|
||||
#import binascii
|
||||
|
||||
import logging
|
||||
try: # Python 2.7+
|
||||
from logging import NullHandler
|
||||
except ImportError:
|
||||
class NullHandler(logging.Handler):
|
||||
def emit(self, record):
|
||||
pass
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
logger.addHandler(NullHandler())
|
||||
|
||||
|
||||
def parse_tag(tag):
|
||||
t = re.search(r"(?P<file_type>[CT])(?P<file_number>\d{1,3})"
|
||||
r"(:)(?P<element_number>\d{1,3})"
|
||||
r"(.)(?P<sub_element>ACC|PRE|EN|DN|TT|CU|CD|DN|OV|UN|UA)", tag, flags=re.IGNORECASE)
|
||||
if t:
|
||||
if (1 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 255):
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': t.group('element_number'),
|
||||
'sub_element': PCCC_CT[t.group('sub_element').upper()],
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 3}
|
||||
|
||||
t = re.search(r"(?P<file_type>[LFBN])(?P<file_number>\d{1,3})"
|
||||
r"(:)(?P<element_number>\d{1,3})"
|
||||
r"(/(?P<sub_element>\d{1,2}))?",
|
||||
tag, flags=re.IGNORECASE)
|
||||
if t:
|
||||
if t.group('sub_element') is not None:
|
||||
if (1 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 255) \
|
||||
and (0 <= int(t.group('sub_element')) <= 15):
|
||||
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': t.group('element_number'),
|
||||
'sub_element': t.group('sub_element'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 3}
|
||||
else:
|
||||
if (1 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 255):
|
||||
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': t.group('element_number'),
|
||||
'sub_element': t.group('sub_element'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 2}
|
||||
|
||||
t = re.search(r"(?P<file_type>[IO])(:)(?P<file_number>\d{1,3})"
|
||||
r"(.)(?P<element_number>\d{1,3})"
|
||||
r"(/(?P<sub_element>\d{1,2}))?", tag, flags=re.IGNORECASE)
|
||||
if t:
|
||||
if t.group('sub_element') is not None:
|
||||
if (0 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 255) \
|
||||
and (0 <= int(t.group('sub_element')) <= 15):
|
||||
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': t.group('element_number'),
|
||||
'sub_element': t.group('sub_element'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 3}
|
||||
else:
|
||||
if (0 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 255):
|
||||
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': t.group('element_number'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 2}
|
||||
|
||||
t = re.search(r"(?P<file_type>S)"
|
||||
r"(:)(?P<element_number>\d{1,3})"
|
||||
r"(/(?P<sub_element>\d{1,2}))?", tag, flags=re.IGNORECASE)
|
||||
if t:
|
||||
if t.group('sub_element') is not None:
|
||||
if (0 <= int(t.group('element_number')) <= 255) \
|
||||
and (0 <= int(t.group('sub_element')) <= 15):
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': '2',
|
||||
'element_number': t.group('element_number'),
|
||||
'sub_element': t.group('sub_element'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 3}
|
||||
else:
|
||||
if 0 <= int(t.group('element_number')) <= 255:
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': '2',
|
||||
'element_number': t.group('element_number'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 2}
|
||||
|
||||
t = re.search(r"(?P<file_type>B)(?P<file_number>\d{1,3})"
|
||||
r"(/)(?P<element_number>\d{1,4})",
|
||||
tag, flags=re.IGNORECASE)
|
||||
if t:
|
||||
if (1 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 4095):
|
||||
bit_position = int(t.group('element_number'))
|
||||
element_number = bit_position / 16
|
||||
sub_element = bit_position - (element_number * 16)
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': element_number,
|
||||
'sub_element': sub_element,
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 3}
|
||||
|
||||
return False, tag
|
||||
|
||||
|
||||
class Driver(Base):
|
||||
"""
|
||||
SLC/PLC_5 Implementation
|
||||
"""
|
||||
def __init__(self):
|
||||
super(Driver, self).__init__()
|
||||
|
||||
self.__version__ = '0.1'
|
||||
self._last_sequence = 0
|
||||
|
||||
def _check_reply(self):
|
||||
"""
|
||||
check the reply message for errors
|
||||
"""
|
||||
self._more_packets_available = False
|
||||
try:
|
||||
if self._reply is None:
|
||||
self._status = (3, '%s without reply' % REPLAY_INFO[unpack_dint(self._message[:2])])
|
||||
return False
|
||||
# Get the type of command
|
||||
typ = unpack_uint(self._reply[:2])
|
||||
|
||||
# Encapsulation status check
|
||||
if unpack_dint(self._reply[8:12]) != SUCCESS:
|
||||
self._status = (3, "{0} reply status:{1}".format(REPLAY_INFO[typ],
|
||||
SERVICE_STATUS[unpack_dint(self._reply[8:12])]))
|
||||
return False
|
||||
|
||||
# Command Specific Status check
|
||||
if typ == unpack_uint(ENCAPSULATION_COMMAND["send_rr_data"]):
|
||||
status = unpack_usint(self._reply[42:43])
|
||||
if status != SUCCESS:
|
||||
self._status = (3, "send_rr_data reply:{0} - Extend status:{1}".format(
|
||||
SERVICE_STATUS[status], get_extended_status(self._reply, 42)))
|
||||
return False
|
||||
else:
|
||||
return True
|
||||
|
||||
elif typ == unpack_uint(ENCAPSULATION_COMMAND["send_unit_data"]):
|
||||
status = unpack_usint(self._reply[48:49])
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Read Tag Fragmented"]:
|
||||
self._parse_fragment(50, status)
|
||||
return True
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Get Instance Attributes List"]:
|
||||
self._parse_tag_list(50, status)
|
||||
return True
|
||||
if status == 0x06:
|
||||
self._status = (3, "Insufficient Packet Space")
|
||||
self._more_packets_available = True
|
||||
elif status != SUCCESS:
|
||||
self._status = (3, "send_unit_data reply:{0} - Extend status:{1}".format(
|
||||
SERVICE_STATUS[status], get_extended_status(self._reply, 48)))
|
||||
return False
|
||||
else:
|
||||
return True
|
||||
|
||||
return True
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def __queue_data_available(self, queue_number):
|
||||
""" read the queue
|
||||
|
||||
Possible combination can be passed to this method:
|
||||
print c.read_tag('F8:0', 3) return a list of 3 registers starting from F8:0
|
||||
print c.read_tag('F8:0') return one value
|
||||
|
||||
It is possible to read status bit
|
||||
|
||||
:return: None is returned in case of error
|
||||
"""
|
||||
|
||||
# Creating the Message Request Packet
|
||||
self._last_sequence = pack_uint(Base._get_sequence())
|
||||
|
||||
# PCCC_Cmd_Rd_w3_Q2 = [0x0f, 0x00, 0x30, 0x00, 0xa2, 0x6d, 0x00, 0xa5, 0x02, 0x00]
|
||||
message_request = [
|
||||
self._last_sequence,
|
||||
'\x4b',
|
||||
'\x02',
|
||||
CLASS_ID["8-bit"],
|
||||
PATH["PCCC"],
|
||||
'\x07',
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
'\x0f',
|
||||
'\x00',
|
||||
self._last_sequence[1],
|
||||
self._last_sequence[0],
|
||||
'\xa2', # protected typed logical read with three address fields FNC
|
||||
'\x6d', # Byte size to read = 109
|
||||
'\x00', # File Number
|
||||
'\xa5', # File Type
|
||||
pack_uint(queue_number)
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,)):
|
||||
|
||||
sts = int(unpack_uint(self._reply[2:4]))
|
||||
if sts == 146:
|
||||
return True
|
||||
else:
|
||||
return False
|
||||
else:
|
||||
raise DataError("read_queue [send_unit_data] returned not valid data")
|
||||
|
||||
def __save_record(self, filename):
|
||||
with open(filename, "a") as csv_file:
|
||||
logger.debug("SLC __save_record read:{0}".format(self._reply[61:]))
|
||||
csv_file.write(self._reply[61:]+'\n')
|
||||
csv_file.close()
|
||||
|
||||
def __get_queue_size(self, queue_number):
|
||||
""" get queue size
|
||||
"""
|
||||
# Creating the Message Request Packet
|
||||
self._last_sequence = pack_uint(Base._get_sequence())
|
||||
|
||||
message_request = [
|
||||
self._last_sequence,
|
||||
'\x4b',
|
||||
'\x02',
|
||||
CLASS_ID["8-bit"],
|
||||
PATH["PCCC"],
|
||||
'\x07',
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
'\x0f',
|
||||
'\x00',
|
||||
self._last_sequence[1],
|
||||
self._last_sequence[0],
|
||||
# '\x30',
|
||||
# '\x00',
|
||||
'\xa1', # FNC to get the queue size
|
||||
'\x06', # Byte size to read = 06
|
||||
'\x00', # File Number
|
||||
'\xea', # File Type ????
|
||||
'\xff', # File Type ????
|
||||
pack_uint(queue_number)
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,)):
|
||||
sts = int(unpack_uint(self._reply[65:67]))
|
||||
logger.debug("SLC __get_queue_size({0}) returned {1}".format(queue_number, sts))
|
||||
return sts
|
||||
else:
|
||||
raise DataError("read_queue [send_unit_data] returned not valid data")
|
||||
|
||||
def read_queue(self, queue_number, file_name):
|
||||
""" read the queue
|
||||
|
||||
"""
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (5, "Target did not connected. is_queue_available will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. is_queue_available will not be executed.")
|
||||
|
||||
if self.__queue_data_available(queue_number):
|
||||
logger.debug("SLC read_queue: Queue {0} has data".format(queue_number))
|
||||
self.__save_record(file_name + str(queue_number) + ".csv")
|
||||
size = self.__get_queue_size(queue_number)
|
||||
if size > 0:
|
||||
for i in range(0, size):
|
||||
if self.__queue_data_available(queue_number):
|
||||
self.__save_record(file_name + str(queue_number) + ".csv")
|
||||
|
||||
logger.debug("SLC read_queue: {0} record extract from queue {1}".format(size, queue_number))
|
||||
else:
|
||||
logger.debug("SLC read_queue: Queue {0} has no data".format(queue_number))
|
||||
|
||||
def read_tag(self, tag, n=1):
|
||||
""" read tag from a connected plc
|
||||
|
||||
Possible combination can be passed to this method:
|
||||
print c.read_tag('F8:0', 3) return a list of 3 registers starting from F8:0
|
||||
print c.read_tag('F8:0') return one value
|
||||
|
||||
It is possible to read status bit
|
||||
|
||||
:return: None is returned in case of error
|
||||
"""
|
||||
res = parse_tag(tag)
|
||||
if not res[0]:
|
||||
self._status = (1000, "Error parsing the tag passed to read_tag({0},{1})".format(tag, n))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error parsing the tag passed to read_tag({0},{1})".format(tag, n))
|
||||
|
||||
bit_read = False
|
||||
bit_position = 0
|
||||
sub_element = 0
|
||||
if int(res[2]['address_field']) == 3:
|
||||
bit_read = True
|
||||
bit_position = int(res[2]['sub_element'])
|
||||
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (5, "Target did not connected. read_tag will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. read_tag will not be executed.")
|
||||
|
||||
data_size = PCCC_DATA_SIZE[res[2]['file_type']]
|
||||
|
||||
# Creating the Message Request Packet
|
||||
self._last_sequence = pack_uint(Base._get_sequence())
|
||||
|
||||
message_request = [
|
||||
self._last_sequence,
|
||||
'\x4b',
|
||||
'\x02',
|
||||
CLASS_ID["8-bit"],
|
||||
PATH["PCCC"],
|
||||
'\x07',
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
'\x0f',
|
||||
'\x00',
|
||||
self._last_sequence[1],
|
||||
self._last_sequence[0],
|
||||
res[2]['read_func'],
|
||||
pack_usint(data_size * n),
|
||||
pack_usint(int(res[2]['file_number'])),
|
||||
PCCC_DATA_TYPE[res[2]['file_type']],
|
||||
pack_usint(int(res[2]['element_number'])),
|
||||
pack_usint(sub_element)
|
||||
]
|
||||
|
||||
logger.debug("SLC read_tag({0},{1})".format(tag, n))
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,)):
|
||||
sts = int(unpack_usint(self._reply[58]))
|
||||
try:
|
||||
if sts != 0:
|
||||
sts_txt = PCCC_ERROR_CODE[sts]
|
||||
self._status = (1000, "Error({0}) returned from read_tag({1},{2})".format(sts_txt, tag, n))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error({0}) returned from read_tag({1},{2})".format(sts_txt, tag, n))
|
||||
|
||||
new_value = 61
|
||||
if bit_read:
|
||||
if res[2]['file_type'] == 'T' or res[2]['file_type'] == 'C':
|
||||
if bit_position == PCCC_CT['PRE']:
|
||||
return UNPACK_PCCC_DATA_FUNCTION[res[2]['file_type']](
|
||||
self._reply[new_value+2:new_value+2+data_size])
|
||||
elif bit_position == PCCC_CT['ACC']:
|
||||
return UNPACK_PCCC_DATA_FUNCTION[res[2]['file_type']](
|
||||
self._reply[new_value+4:new_value+4+data_size])
|
||||
|
||||
tag_value = UNPACK_PCCC_DATA_FUNCTION[res[2]['file_type']](
|
||||
self._reply[new_value:new_value+data_size])
|
||||
return get_bit(tag_value, bit_position)
|
||||
|
||||
else:
|
||||
values_list = []
|
||||
while len(self._reply[new_value:]) >= data_size:
|
||||
values_list.append(
|
||||
UNPACK_PCCC_DATA_FUNCTION[res[2]['file_type']](self._reply[new_value:new_value+data_size])
|
||||
)
|
||||
new_value = new_value+data_size
|
||||
|
||||
if len(values_list) > 1:
|
||||
return values_list
|
||||
else:
|
||||
return values_list[0]
|
||||
|
||||
except Exception as e:
|
||||
self._status = (1000, "Error({0}) parsing the data returned from read_tag({1},{2})".format(e, tag, n))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error({0}) parsing the data returned from read_tag({1},{2})".format(e, tag, n))
|
||||
else:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
def write_tag(self, tag, value):
|
||||
""" write tag from a connected plc
|
||||
|
||||
Possible combination can be passed to this method:
|
||||
c.write_tag('N7:0', [-30, 32767, -32767])
|
||||
c.write_tag('N7:0', 21)
|
||||
c.read_tag('N7:0', 10)
|
||||
|
||||
It is not possible to write status bit
|
||||
|
||||
:return: None is returned in case of error
|
||||
"""
|
||||
res = parse_tag(tag)
|
||||
if not res[0]:
self._status = (1000, "Error parsing the tag passed to write_tag({0},{1})".format(tag, value))
logger.warning(self._status)
raise DataError("Error parsing the tag passed to write_tag({0},{1})".format(tag, value))

if isinstance(value, list) and int(res[2]['address_field']) == 3:
self._status = (1000, "Function's parameters error. write_tag({0},{1})".format(tag, value))
logger.warning(self._status)
raise DataError("Function's parameters error. write_tag({0},{1})".format(tag, value))
|
||||
|
||||
bit_field = False
|
||||
bit_position = 0
|
||||
sub_element = 0
|
||||
if int(res[2]['address_field']) == 3:
|
||||
bit_field = True
|
||||
bit_position = int(res[2]['sub_element'])
|
||||
values_list = ''
|
||||
else:
|
||||
values_list = '\xff\xff'
|
||||
|
||||
multi_requests = False
|
||||
if isinstance(value, list):
|
||||
multi_requests = True
|
||||
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (1000, "Target did not connected. write_tag will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. write_tag will not be executed.")
|
||||
|
||||
try:
|
||||
n = 0
|
||||
if multi_requests:
|
||||
data_size = PCCC_DATA_SIZE[res[2]['file_type']]
|
||||
for v in value:
|
||||
values_list += PACK_PCCC_DATA_FUNCTION[res[2]['file_type']](v)
|
||||
n += 1
|
||||
else:
|
||||
n = 1
|
||||
if bit_field:
|
||||
data_size = 2
|
||||
|
||||
if (res[2]['file_type'] == 'T' or res[2]['file_type'] == 'C') \
|
||||
and (bit_position == PCCC_CT['PRE'] or bit_position == PCCC_CT['ACC']):
|
||||
sub_element = bit_position
|
||||
values_list = '\xff\xff' + PACK_PCCC_DATA_FUNCTION[res[2]['file_type']](value)
|
||||
else:
|
||||
sub_element = 0
|
||||
if value > 0:
# bit mask word followed by the data word; pack_uint needs an int (math.pow returns a float)
values_list = pack_uint(1 << bit_position) + pack_uint(1 << bit_position)
else:
values_list = pack_uint(1 << bit_position) + pack_uint(0)
|
||||
|
||||
else:
|
||||
values_list += PACK_PCCC_DATA_FUNCTION[res[2]['file_type']](value)
|
||||
data_size = PCCC_DATA_SIZE[res[2]['file_type']]
|
||||
|
||||
except Exception as e:
|
||||
self._status = (1000, "Error({0}) packing the values to write to the"
|
||||
"SLC write_tag({1},{2})".format(e, tag, value))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error({0}) packing the values to write to the "
|
||||
"SLC write_tag({1},{2})".format(e, tag, value))
|
||||
|
||||
data_to_write = values_list
|
||||
|
||||
# Creating the Message Request Packet
|
||||
self._last_sequence = pack_uint(Base._get_sequence())
|
||||
|
||||
message_request = [
|
||||
self._last_sequence,
|
||||
'\x4b',
|
||||
'\x02',
|
||||
CLASS_ID["8-bit"],
|
||||
PATH["PCCC"],
|
||||
'\x07',
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
'\x0f',
|
||||
'\x00',
|
||||
self._last_sequence[1],
|
||||
self._last_sequence[0],
|
||||
res[2]['write_func'],
|
||||
pack_usint(data_size * n),
|
||||
pack_usint(int(res[2]['file_number'])),
|
||||
PCCC_DATA_TYPE[res[2]['file_type']],
|
||||
pack_usint(int(res[2]['element_number'])),
|
||||
pack_usint(sub_element)
|
||||
]
|
||||
|
||||
logger.debug("SLC write_tag({0},{1})".format(tag, value))
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request) + data_to_write,
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,)):
|
||||
sts = int(unpack_usint(self._reply[58]))
|
||||
try:
|
||||
if sts != 0:
|
||||
sts_txt = PCCC_ERROR_CODE[sts]
|
||||
self._status = (1000, "Error({0}) returned from SLC write_tag({1},{2})".format(sts_txt, tag, value))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error({0}) returned from SLC write_tag({1},{2})".format(sts_txt, tag, value))
|
||||
|
||||
return True
|
||||
except Exception as e:
|
||||
self._status = (1000, "Error({0}) parsing the data returned from "
|
||||
"SLC write_tag({1},{2})".format(e, tag, value))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error({0}) parsing the data returned from "
|
||||
"SLC write_tag({1},{2})".format(e, tag, value))
|
||||
else:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
1
daq/pycomm-master/pycomm/cip/__init__.py
Executable file
@@ -0,0 +1 @@
|
||||
__author__ = 'agostino'
|
||||
864
daq/pycomm-master/pycomm/cip/cip_base.py
Executable file
@@ -0,0 +1,864 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# cip_base.py - A set of classes methods and structures used to implement Ethernet/IP
|
||||
#
|
||||
#
|
||||
# Copyright (c) 2014 Agostino Ruscito <ruscito@gmail.com>
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
# of this software and associated documentation files (the "Software"), to deal
|
||||
# in the Software without restriction, including without limitation the rights
|
||||
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
# copies of the Software, and to permit persons to whom the Software is
|
||||
# furnished to do so, subject to the following conditions:
|
||||
#
|
||||
# The above copyright notice and this permission notice shall be included in all
|
||||
# copies or substantial portions of the Software.
|
||||
#
|
||||
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
# SOFTWARE.
|
||||
#
|
||||
|
||||
import struct
|
||||
import socket
|
||||
import random
|
||||
|
||||
from os import getpid
|
||||
from pycomm.cip.cip_const import *
|
||||
from pycomm.common import PycommError
|
||||
|
||||
|
||||
import logging
|
||||
try: # Python 2.7+
|
||||
from logging import NullHandler
|
||||
except ImportError:
|
||||
class NullHandler(logging.Handler):
|
||||
def emit(self, record):
|
||||
pass
|
||||
logger = logging.getLogger(__name__)
|
||||
logger.addHandler(NullHandler())
|
||||
|
||||
|
||||
class CommError(PycommError):
|
||||
pass
|
||||
|
||||
|
||||
class DataError(PycommError):
|
||||
pass
|
||||
|
||||
|
||||
def pack_sint(n):
|
||||
return struct.pack('b', n)
|
||||
|
||||
|
||||
def pack_usint(n):
|
||||
return struct.pack('B', n)
|
||||
|
||||
|
||||
def pack_int(n):
|
||||
"""pack 16 bit into 2 bytes little endian"""
|
||||
return struct.pack('<h', n)
|
||||
|
||||
|
||||
def pack_uint(n):
|
||||
"""pack 16 bit into 2 bytes little endian"""
|
||||
return struct.pack('<H', n)
|
||||
|
||||
|
||||
def pack_dint(n):
|
||||
"""pack 32 bit into 4 bytes little endian"""
|
||||
return struct.pack('<i', n)
|
||||
|
||||
|
||||
def pack_real(r):
"""pack 32-bit float into 4 bytes little endian"""
return struct.pack('<f', r)


def pack_lint(l):
"""pack 64-bit integer into 8 bytes little endian"""
return struct.pack('<q', l)
|
||||
|
||||
|
||||
def unpack_bool(st):
|
||||
if not (int(struct.unpack('B', st[0])[0]) == 0):
|
||||
return 1
|
||||
return 0
|
||||
|
||||
|
||||
def unpack_sint(st):
|
||||
return int(struct.unpack('b', st[0])[0])
|
||||
|
||||
|
||||
def unpack_usint(st):
|
||||
return int(struct.unpack('B', st[0])[0])
|
||||
|
||||
|
||||
def unpack_int(st):
|
||||
"""unpack 2 bytes little endian to int"""
|
||||
return int(struct.unpack('<h', st[0:2])[0])
|
||||
|
||||
|
||||
def unpack_uint(st):
|
||||
"""unpack 2 bytes little endian to int"""
|
||||
return int(struct.unpack('<H', st[0:2])[0])
|
||||
|
||||
|
||||
def unpack_dint(st):
|
||||
"""unpack 4 bytes little endian to int"""
|
||||
return int(struct.unpack('<i', st[0:4])[0])
|
||||
|
||||
|
||||
def unpack_real(st):
"""unpack 4 bytes little endian to float"""
return float(struct.unpack('<f', st[0:4])[0])


def unpack_lint(st):
"""unpack 8 bytes little endian to int"""
return int(struct.unpack('<q', st[0:8])[0])
|
||||
|
||||
|
||||
def get_bit(value, idx):
|
||||
""":returns value of bit at position idx"""
|
||||
return (value & (1 << idx)) != 0
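# Worked example (illustrative only): the pack_*/unpack_* helpers above are little-endian and
# round-trip each other, and get_bit() tests a single bit position.
#
#     >>> [hex(ord(c)) for c in pack_uint(0x1234)]
#     ['0x34', '0x12']
#     >>> unpack_uint(pack_uint(0x1234))
#     4660
#     >>> get_bit(0b0101, 0), get_bit(0b0101, 1)
#     (True, False)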
|
||||
|
||||
|
||||
PACK_DATA_FUNCTION = {
|
||||
'BOOL': pack_sint,
|
||||
'SINT': pack_sint, # Signed 8-bit integer
|
||||
'INT': pack_int, # Signed 16-bit integer
|
||||
'UINT': pack_uint, # Unsigned 16-bit integer
|
||||
'USINT': pack_usint, # Unsigned Byte Integer
|
||||
'DINT': pack_dint, # Signed 32-bit integer
|
||||
'REAL': pack_real, # 32-bit floating point
|
||||
'LINT': pack_lint,
|
||||
'BYTE': pack_sint, # byte string 8-bits
|
||||
'WORD': pack_uint, # byte string 16-bits
|
||||
'DWORD': pack_dint, # byte string 32-bits
|
||||
'LWORD': pack_lint # byte string 64-bits
|
||||
}
|
||||
|
||||
|
||||
UNPACK_DATA_FUNCTION = {
|
||||
'BOOL': unpack_bool,
|
||||
'SINT': unpack_sint, # Signed 8-bit integer
|
||||
'INT': unpack_int, # Signed 16-bit integer
|
||||
'UINT': unpack_uint, # Unsigned 16-bit integer
|
||||
'USINT': unpack_usint, # Unsigned Byte Integer
|
||||
'DINT': unpack_dint, # Signed 32-bit integer
|
||||
'REAL': unpack_real, # 32-bit floating point,
|
||||
'LINT': unpack_lint,
|
||||
'BYTE': unpack_sint, # byte string 8-bits
|
||||
'WORD': unpack_uint, # byte string 16-bits
|
||||
'DWORD': unpack_dint, # byte string 32-bits
|
||||
'LWORD': unpack_lint # byte string 64-bits
|
||||
}
|
||||
|
||||
|
||||
DATA_FUNCTION_SIZE = {
|
||||
'BOOL': 1,
|
||||
'SINT': 1, # Signed 8-bit integer
|
||||
'USINT': 1, # Unsigned 8-bit integer
|
||||
'INT': 2, # Signed 16-bit integer
|
||||
'UINT': 2, # Unsigned 16-bit integer
|
||||
'DINT': 4, # Signed 32-bit integer
|
||||
'REAL': 4, # 32-bit floating point
|
||||
'LINT': 8,
|
||||
'BYTE': 1, # byte string 8-bits
|
||||
'WORD': 2, # byte string 16-bits
|
||||
'DWORD': 4, # byte string 32-bits
|
||||
'LWORD': 8 # byte string 64-bits
|
||||
}
|
||||
|
||||
UNPACK_PCCC_DATA_FUNCTION = {
|
||||
'N': unpack_int,
|
||||
'B': unpack_int,
|
||||
'T': unpack_int,
|
||||
'C': unpack_int,
|
||||
'S': unpack_int,
|
||||
'F': unpack_real,
|
||||
'A': unpack_sint,
|
||||
'R': unpack_dint,
|
||||
'O': unpack_int,
|
||||
'I': unpack_int
|
||||
}
|
||||
|
||||
PACK_PCCC_DATA_FUNCTION = {
|
||||
'N': pack_int,
|
||||
'B': pack_int,
|
||||
'T': pack_int,
|
||||
'C': pack_int,
|
||||
'S': pack_int,
|
||||
'F': pack_real,
|
||||
'A': pack_sint,
|
||||
'R': pack_dint,
|
||||
'O': pack_int,
|
||||
'I': pack_int
|
||||
}
|
||||
|
||||
def print_bytes_line(msg):
|
||||
out = ''
|
||||
for ch in msg:
|
||||
out += "{:0>2x}".format(ord(ch))
|
||||
return out
|
||||
|
||||
|
||||
def print_bytes_msg(msg, info=''):
|
||||
out = info
|
||||
new_line = True
|
||||
line = 0
|
||||
column = 0
|
||||
for idx, ch in enumerate(msg):
|
||||
if new_line:
|
||||
out += "\n({:0>4d}) ".format(line * 10)
|
||||
new_line = False
|
||||
out += "{:0>2x} ".format(ord(ch))
|
||||
if column == 9:
|
||||
new_line = True
|
||||
column = 0
|
||||
line += 1
|
||||
else:
|
||||
column += 1
|
||||
return out
|
||||
|
||||
|
||||
def get_extended_status(msg, start):
|
||||
status = unpack_usint(msg[start:start+1])
|
||||
# send_rr_data
|
||||
# 42 General Status
|
||||
# 43 Size of additional status
|
||||
# 44..n additional status
|
||||
|
||||
# send_unit_data
|
||||
# 48 General Status
|
||||
# 49 Size of additional status
|
||||
# 50..n additional status
|
||||
extended_status_size = (unpack_usint(msg[start+1:start+2]))*2
|
||||
extended_status = 0
|
||||
if extended_status_size != 0:
|
||||
# There is an additional status
|
||||
if extended_status_size == 1:
|
||||
extended_status = unpack_usint(msg[start+2:start+3])
|
||||
elif extended_status_size == 2:
|
||||
extended_status = unpack_uint(msg[start+2:start+4])
|
||||
elif extended_status_size == 4:
|
||||
extended_status = unpack_dint(msg[start+2:start+6])
|
||||
else:
|
||||
return 'Extended Status Size Unknown'
|
||||
try:
|
||||
return '{0}'.format(EXTEND_CODES[status][extended_status])
|
||||
except LookupError:
|
||||
return "Extended Status info not present"
|
||||
|
||||
|
||||
def create_tag_rp(tag, multi_requests=False):
|
||||
""" Create tag Request Packet
|
||||
|
||||
It returns the request path packed around the tag passed.
If any error occurs it returns None.
|
||||
"""
|
||||
tags = tag.split('.')
|
||||
rp = []
|
||||
index = []
|
||||
for tag in tags:
|
||||
add_index = False
|
||||
# Check if is an array tag
|
||||
if tag.find('[') != -1:
|
||||
# Remove the last square bracket
|
||||
tag = tag[:len(tag)-1]
|
||||
# Isolate the value inside bracket
|
||||
inside_value = tag[tag.find('[')+1:]
|
||||
# Now split the inside value in case part of multidimensional array
|
||||
index = inside_value.split(',')
|
||||
# Flag the existence of one or more indexes
|
||||
add_index = True
|
||||
# Get only the tag part
|
||||
tag = tag[:tag.find('[')]
|
||||
tag_length = len(tag)
|
||||
|
||||
# Create the request path
|
||||
rp.append(EXTENDED_SYMBOL) # ANSI Ext. symbolic segment
|
||||
rp.append(chr(tag_length)) # Length of the tag
|
||||
|
||||
# Add the tag to the Request path
|
||||
for char in tag:
|
||||
rp.append(char)
|
||||
# Add pad byte because total length of Request path must be word-aligned
|
||||
if tag_length % 2:
|
||||
rp.append(PADDING_BYTE)
|
||||
# Add any index
|
||||
if add_index:
|
||||
for idx in index:
|
||||
val = int(idx)
|
||||
if val <= 0xff:
|
||||
rp.append(ELEMENT_ID["8-bit"])
|
||||
rp.append(pack_usint(val))
|
||||
elif val <= 0xffff:
|
||||
rp.append(ELEMENT_ID["16-bit"]+PADDING_BYTE)
|
||||
rp.append(pack_uint(val))
|
||||
elif val <= 0xffffffff:
|
||||
rp.append(ELEMENT_ID["32-bit"]+PADDING_BYTE)
|
||||
rp.append(pack_dint(val))
|
||||
else:
|
||||
# Cannot create a valid request packet
|
||||
return None
|
||||
|
||||
# At this point the Request Path is completed,
|
||||
if multi_requests:
|
||||
request_path = chr(len(rp)/2) + ''.join(rp)
|
||||
else:
|
||||
request_path = ''.join(rp)
|
||||
return request_path
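# Worked example (illustrative only): for the tag 'Counts[5]' the request path produced above is
# the ANSI extended symbolic segment, the 6-character name (already word-aligned, so no pad byte)
# and one 8-bit element index:
#
#     >>> [hex(ord(c)) for c in create_tag_rp('Counts[5]')]
#     ['0x91', '0x6', '0x43', '0x6f', '0x75', '0x6e', '0x74', '0x73', '0x28', '0x5']
#
# i.e. 0x91, length 6, 'Counts', 0x28 (8-bit element id), element 5.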
|
||||
|
||||
|
||||
def build_common_packet_format(message_type, message, addr_type, addr_data=None, timeout=10):
|
||||
""" build_common_packet_format
|
||||
|
||||
It creates the common part for a CIP message. Check Volume 2 (page 2.22) of CIP specification for reference
|
||||
"""
|
||||
msg = pack_dint(0) # Interface Handle: shall be 0 for CIP
|
||||
msg += pack_uint(timeout) # timeout
|
||||
msg += pack_uint(2) # Item count: should be at least 2 (Address and Data)
|
||||
msg += addr_type # Address Item Type ID
|
||||
|
||||
if addr_data is not None:
|
||||
msg += pack_uint(len(addr_data)) # Address Item Length
|
||||
msg += addr_data
|
||||
else:
|
||||
msg += pack_uint(0) # Address Item Length
|
||||
msg += message_type # Data Type ID
|
||||
msg += pack_uint(len(message)) # Data Item Length
|
||||
msg += message
|
||||
return msg
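# Worked example (illustrative only): a connected (send_unit_data) message wraps the payload in
# two CPF items - an address item carrying the 4-byte connection id and a data item carrying the
# message itself - so the total length is 4 (handle) + 2 (timeout) + 2 (item count)
# + 2 + 2 + 4 (address item) + 2 + 2 + len(message) (data item).
#
#     >>> cpf = build_common_packet_format(DATA_ITEM['Connected'], '\x01\x02',
#     ...                                  ADDRESS_ITEM['Connection Based'], addr_data='\xaa\xbb\xcc\xdd')
#     >>> len(cpf)
#     22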
|
||||
|
||||
|
||||
def build_multiple_service(rp_list, sequence=None):
|
||||
|
||||
mr = []
|
||||
if sequence is not None:
|
||||
mr.append(pack_uint(sequence))
|
||||
|
||||
mr.append(chr(TAG_SERVICES_REQUEST["Multiple Service Packet"])) # the Request Service
|
||||
mr.append(pack_usint(2)) # the Request Path Size length in word
|
||||
mr.append(CLASS_ID["8-bit"])
|
||||
mr.append(CLASS_CODE["Message Router"])
|
||||
mr.append(INSTANCE_ID["8-bit"])
|
||||
mr.append(pack_usint(1)) # Instance 1
|
||||
mr.append(pack_uint(len(rp_list))) # Number of service contained in the request
|
||||
|
||||
# Offset calculation
|
||||
offset = (len(rp_list) * 2) + 2
|
||||
for index, rp in enumerate(rp_list):
|
||||
if index == 0:
|
||||
mr.append(pack_uint(offset)) # Starting offset
|
||||
else:
|
||||
mr.append(pack_uint(offset))
|
||||
offset += len(rp)
|
||||
|
||||
for rp in rp_list:
|
||||
mr.append(rp)
|
||||
return mr
|
||||
|
||||
|
||||
def parse_multiple_request(message, tags, typ):
|
||||
""" parse_multi_request
|
||||
This function should be used to parse the reply to a multiple-request service wrapped around the
send_unit_data message.
|
||||
|
||||
|
||||
:param message: the full message returned from the PLC
|
||||
:param tags: The list of tags to be read
|
||||
:param typ: to specify if multi request service READ or WRITE
|
||||
:return: a list of tuple in the format [ (tag name, value, data type), ( tag name, value, data type) ].
|
||||
In case of error the tuple will be (tag name, None, None)
|
||||
"""
|
||||
offset = 50
|
||||
position = 50
|
||||
number_of_service_replies = unpack_uint(message[offset:offset+2])
|
||||
tag_list = []
|
||||
for index in range(number_of_service_replies):
|
||||
position += 2
|
||||
start = offset + unpack_uint(message[position:position+2])
|
||||
general_status = unpack_usint(message[start+2:start+3])
|
||||
|
||||
if general_status == 0:
|
||||
if typ == "READ":
|
||||
data_type = unpack_uint(message[start+4:start+6])
|
||||
try:
|
||||
value_begin = start + 6
|
||||
value_end = value_begin + DATA_FUNCTION_SIZE[I_DATA_TYPE[data_type]]
|
||||
value = message[value_begin:value_end]
|
||||
tag_list.append((tags[index],
|
||||
UNPACK_DATA_FUNCTION[I_DATA_TYPE[data_type]](value),
|
||||
I_DATA_TYPE[data_type]))
|
||||
except LookupError:
|
||||
tag_list.append((tags[index], None, None))
|
||||
else:
|
||||
tag_list.append((tags[index] + ('GOOD',)))
|
||||
else:
|
||||
if typ == "READ":
|
||||
tag_list.append((tags[index], None, None))
|
||||
else:
|
||||
tag_list.append((tags[index] + ('BAD',)))
|
||||
return tag_list
|
||||
|
||||
|
||||
class Socket:
|
||||
|
||||
def __init__(self, timeout=5.0):
|
||||
self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
|
||||
self.sock.settimeout(timeout)
|
||||
self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
|
||||
|
||||
def connect(self, host, port):
|
||||
try:
|
||||
self.sock.connect((host, port))
|
||||
except socket.timeout:
|
||||
raise CommError("Socket timeout during connection.")
|
||||
|
||||
def send(self, msg, timeout=0):
|
||||
if timeout != 0:
|
||||
self.sock.settimeout(timeout)
|
||||
total_sent = 0
|
||||
while total_sent < len(msg):
|
||||
try:
|
||||
sent = self.sock.send(msg[total_sent:])
|
||||
if sent == 0:
|
||||
raise CommError("socket connection broken.")
|
||||
total_sent += sent
|
||||
except socket.error:
|
||||
raise CommError("socket connection broken.")
|
||||
return total_sent
|
||||
|
||||
def receive(self, timeout=0):
|
||||
if timeout != 0:
|
||||
self.sock.settimeout(timeout)
|
||||
msg_len = 28
|
||||
chunks = []
|
||||
bytes_recd = 0
|
||||
one_shot = True
|
||||
while bytes_recd < msg_len:
|
||||
try:
|
||||
chunk = self.sock.recv(min(msg_len - bytes_recd, 2048))
|
||||
if chunk == '':
|
||||
raise CommError("socket connection broken.")
|
||||
if one_shot:
|
||||
data_size = int(struct.unpack('<H', chunk[2:4])[0]) # Length
|
||||
msg_len = HEADER_SIZE + data_size
|
||||
one_shot = False
|
||||
|
||||
chunks.append(chunk)
|
||||
bytes_recd += len(chunk)
|
||||
except socket.error as e:
|
||||
raise CommError(e)
|
||||
return ''.join(chunks)
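# Framing note (illustrative only): receive() reads the encapsulation header first, then uses the
# little-endian length field at bytes 2..3 of the header to work out how much payload follows
# (total = HEADER_SIZE + length).
#
#     >>> import struct
#     >>> header = '\x70\x00\x10\x00' + '\x00' * 20        # send_unit_data, length = 0x0010
#     >>> struct.unpack('<H', header[2:4])[0]
#     16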
|
||||
|
||||
def close(self):
|
||||
self.sock.close()
|
||||
|
||||
|
||||
def parse_symbol_type(symbol):
|
||||
""" parse_symbol_type
|
||||
|
||||
It parses the symbol according to the Rockwell spec.
|
||||
:param symbol: the symbol associated to a tag
|
||||
:return: A tuple containing information about the tag
|
||||
"""
|
||||
pass
|
||||
|
||||
return None
|
||||
|
||||
|
||||
class Base(object):
|
||||
_sequence = 0
|
||||
|
||||
|
||||
def __init__(self):
|
||||
if Base._sequence == 0:
|
||||
Base._sequence = getpid()
|
||||
else:
|
||||
Base._sequence = Base._get_sequence()
|
||||
|
||||
self.__version__ = '0.3'
|
||||
self.__sock = None
|
||||
self.__direct_connections = False
|
||||
self._session = 0
|
||||
self._connection_opened = False
|
||||
self._reply = None
|
||||
self._message = None
|
||||
self._target_cid = None
|
||||
self._target_is_connected = False
|
||||
self._tag_list = []
|
||||
self._buffer = {}
|
||||
self._device_description = "Device Unknown"
|
||||
self._last_instance = 0
|
||||
self._byte_offset = 0
|
||||
self._last_position = 0
|
||||
self._more_packets_available = False
|
||||
self._last_tag_read = ()
|
||||
self._last_tag_write = ()
|
||||
self._status = (0, "")
|
||||
self._output_raw = False # indicating value should be output as raw (hex)
|
||||
|
||||
self.attribs = {'context': '_pycomm_', 'protocol version': 1, 'rpi': 5000, 'port': 0xAF12, 'timeout': 10,
|
||||
'backplane': 1, 'cpu slot': 0, 'option': 0, 'cid': '\x27\x04\x19\x71', 'csn': '\x27\x04',
|
||||
'vid': '\x09\x10', 'vsn': '\x09\x10\x19\x71', 'name': 'Base', 'ip address': None}
|
||||
|
||||
def __len__(self):
|
||||
return len(self.attribs)
|
||||
|
||||
def __getitem__(self, key):
|
||||
return self.attribs[key]
|
||||
|
||||
def __setitem__(self, key, value):
|
||||
self.attribs[key] = value
|
||||
|
||||
def __delitem__(self, key):
|
||||
try:
|
||||
del self.attribs[key]
|
||||
except LookupError:
|
||||
pass
|
||||
|
||||
def __iter__(self):
|
||||
return iter(self.attribs)
|
||||
|
||||
def __contains__(self, item):
|
||||
return item in self.attribs
|
||||
|
||||
def _check_reply(self):
|
||||
raise Socket.ImplementationError("The method has not been implemented")
|
||||
|
||||
@staticmethod
|
||||
def _get_sequence():
|
||||
""" Increase and return the sequence used with connected messages
|
||||
|
||||
:return: The New sequence
|
||||
"""
|
||||
if Base._sequence < 65535:
|
||||
Base._sequence += 1
|
||||
else:
|
||||
Base._sequence = getpid()
|
||||
return Base._sequence
|
||||
|
||||
def nop(self):
|
||||
""" No replay command
|
||||
|
||||
A NOP provides a way for either an originator or target to determine if the TCP connection is still open.
|
||||
"""
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND['nop'], 0)
|
||||
self._send()
|
||||
|
||||
def __repr__(self):
|
||||
return self._device_description
|
||||
|
||||
def generate_cid(self):
|
||||
self.attribs['cid'] = '{0}{1}{2}{3}'.format(chr(random.randint(0, 255)), chr(random.randint(0, 255)),
chr(random.randint(0, 255)), chr(random.randint(0, 255)))
|
||||
|
||||
def description(self):
|
||||
return self._device_description
|
||||
|
||||
def list_identity(self):
|
||||
""" ListIdentity command to locate and identify potential target
|
||||
|
||||
return True if the reply contains the device description
|
||||
"""
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND['list_identity'], 0)
|
||||
self._send()
|
||||
self._receive()
|
||||
if self._check_reply():
|
||||
try:
|
||||
self._device_description = self._reply[63:-1]
|
||||
return True
|
||||
except Exception as e:
|
||||
raise CommError(e)
|
||||
return False
|
||||
|
||||
def send_rr_data(self, msg):
|
||||
""" SendRRData transfer an encapsulated request/reply packet between the originator and target
|
||||
|
||||
:param msg: The message to be send to the target
|
||||
:return: the replay received from the target
|
||||
"""
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND["send_rr_data"], len(msg))
|
||||
self._message += msg
|
||||
self._send()
|
||||
self._receive()
|
||||
return self._check_reply()
|
||||
|
||||
def send_unit_data(self, msg):
|
||||
""" SendUnitData send encapsulated connected messages.
|
||||
|
||||
:param msg: The message to be send to the target
|
||||
:return: the replay received from the target
|
||||
"""
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND["send_unit_data"], len(msg))
|
||||
self._message += msg
|
||||
self._send()
|
||||
self._receive()
|
||||
return self._check_reply()
|
||||
|
||||
def get_status(self):
|
||||
""" Get the last status/error
|
||||
|
||||
This method can be used after any call to get any details in case of error
|
||||
:return: A tuple containing (error group, error message)
|
||||
"""
|
||||
return self._status
|
||||
|
||||
def clear(self):
|
||||
""" Clear the last status/error
|
||||
|
||||
:return: None; the status is reset to an empty tuple
|
||||
"""
|
||||
self._status = (0, "")
|
||||
|
||||
def build_header(self, command, length):
|
||||
""" Build the encapsulate message header
|
||||
|
||||
The header is 24 bytes fixed length, and includes the command and the length of the optional data portion.
|
||||
|
||||
:return: the header
|
||||
"""
|
||||
try:
|
||||
h = command # Command UINT
|
||||
h += pack_uint(length) # Length UINT
|
||||
h += pack_dint(self._session) # Session Handle UDINT
|
||||
h += pack_dint(0) # Status UDINT
|
||||
h += self.attribs['context'] # Sender Context 8 bytes
|
||||
h += pack_dint(self.attribs['option']) # Option UDINT
|
||||
return h
|
||||
except Exception as e:
|
||||
raise CommError(e)
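# Layout note (illustrative only): the encapsulation header built above is always 24 bytes:
# command (2) + length (2) + session handle (4) + status (4) + sender context (8) + options (4).
# A quick sanity check, assuming the default 8-character '_pycomm_' context:
#
#     >>> len(ENCAPSULATION_COMMAND['nop'] + pack_uint(0) + pack_dint(0) + pack_dint(0) + '_pycomm_' + pack_dint(0))
#     24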
|
||||
|
||||
def register_session(self):
|
||||
""" Register a new session with the communication partner
|
||||
|
||||
:return: None if any error, otherwise return the session number
|
||||
"""
|
||||
if self._session:
|
||||
return self._session
|
||||
|
||||
self._session = 0
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND['register_session'], 4)
|
||||
self._message += pack_uint(self.attribs['protocol version'])
|
||||
self._message += pack_uint(0)
|
||||
self._send()
|
||||
self._receive()
|
||||
if self._check_reply():
|
||||
self._session = unpack_dint(self._reply[4:8])
|
||||
logger.debug("Session ={0} has been registered.".format(print_bytes_line(self._reply[4:8])))
|
||||
return self._session
|
||||
|
||||
self._status = 'Warning ! the session has not been registered.'
|
||||
logger.warning(self._status)
|
||||
return None
|
||||
|
||||
def forward_open(self):
|
||||
""" CIP implementation of the forward open message
|
||||
|
||||
Refer to ODVA documentation Volume 1 3-5.5.2
|
||||
|
||||
:return: False if any error in the reply message
|
||||
"""
|
||||
if self._session == 0:
|
||||
self._status = (4, "A session need to be registered before to call forward_open.")
|
||||
raise CommError("A session need to be registered before to call forward open")
|
||||
|
||||
forward_open_msg = [
|
||||
FORWARD_OPEN,
|
||||
pack_usint(2),
|
||||
CLASS_ID["8-bit"],
|
||||
CLASS_CODE["Connection Manager"], # Volume 1: 5-1
|
||||
INSTANCE_ID["8-bit"],
|
||||
CONNECTION_MANAGER_INSTANCE['Open Request'],
|
||||
PRIORITY,
|
||||
TIMEOUT_TICKS,
|
||||
pack_dint(0),
|
||||
self.attribs['cid'],
|
||||
self.attribs['csn'],
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
TIMEOUT_MULTIPLIER,
|
||||
'\x00\x00\x00',
|
||||
pack_dint(self.attribs['rpi'] * 1000),
|
||||
pack_uint(CONNECTION_PARAMETER['Default']),
|
||||
pack_dint(self.attribs['rpi'] * 1000),
|
||||
pack_uint(CONNECTION_PARAMETER['Default']),
|
||||
TRANSPORT_CLASS, # Transport Class
|
||||
# CONNECTION_SIZE['Backplane'],
|
||||
# pack_usint(self.attribs['backplane']),
|
||||
# pack_usint(self.attribs['cpu slot']),
|
||||
CLASS_ID["8-bit"],
|
||||
CLASS_CODE["Message Router"],
|
||||
INSTANCE_ID["8-bit"],
|
||||
pack_usint(1)
|
||||
]
|
||||
|
||||
if self.__direct_connections:
|
||||
forward_open_msg[20:1] = [
|
||||
CONNECTION_SIZE['Direct Network'],
|
||||
]
|
||||
else:
|
||||
forward_open_msg[20:3] = [
|
||||
CONNECTION_SIZE['Backplane'],
|
||||
pack_usint(self.attribs['backplane']),
|
||||
pack_usint(self.attribs['cpu slot'])
|
||||
]
|
||||
|
||||
if self.send_rr_data(
|
||||
build_common_packet_format(DATA_ITEM['Unconnected'], ''.join(forward_open_msg), ADDRESS_ITEM['UCMM'],)):
|
||||
self._target_cid = self._reply[44:48]
|
||||
self._target_is_connected = True
|
||||
return True
|
||||
self._status = (4, "forward_open returned False")
|
||||
return False
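# Usage sketch (an assumption, not part of the original source): the normal connection sequence a
# driver built on Base goes through is open() -> register_session() -> forward_open(), after which
# connected requests travel via send_unit_data(); forward_close() and un_register_session() undo it.
#
#     # plc = SomeDriverSubclass()        # hypothetical subclass implementing _check_reply()
#     # if plc.open('192.168.1.10'):      # open() registers the session internally
#     #     if plc.forward_open():
#     #         ...                       # send_unit_data(...) exchanges connected messages
#     #         plc.forward_close()
#     #     plc.close()                   # un-registers the session and closes the socket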
|
||||
|
||||
def forward_close(self):
|
||||
""" CIP implementation of the forward close message
|
||||
|
||||
Each connection opened with the forward open message needs to be closed.
|
||||
Refer to ODVA documentation Volume 1 3-5.5.3
|
||||
|
||||
:return: False if any error in the reply message
|
||||
"""
|
||||
|
||||
if self._session == 0:
|
||||
self._status = (5, "A session need to be registered before to call forward_close.")
|
||||
raise CommError("A session need to be registered before to call forward_close.")
|
||||
|
||||
forward_close_msg = [
|
||||
FORWARD_CLOSE,
|
||||
pack_usint(2),
|
||||
CLASS_ID["8-bit"],
|
||||
CLASS_CODE["Connection Manager"], # Volume 1: 5-1
|
||||
INSTANCE_ID["8-bit"],
|
||||
CONNECTION_MANAGER_INSTANCE['Open Request'],
|
||||
PRIORITY,
|
||||
TIMEOUT_TICKS,
|
||||
self.attribs['csn'],
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
# CONNECTION_SIZE['Backplane'],
|
||||
# '\x00', # Reserved
|
||||
# pack_usint(self.attribs['backplane']),
|
||||
# pack_usint(self.attribs['cpu slot']),
|
||||
CLASS_ID["8-bit"],
|
||||
CLASS_CODE["Message Router"],
|
||||
INSTANCE_ID["8-bit"],
|
||||
pack_usint(1)
|
||||
]
|
||||
|
||||
if self.__direct_connections:
|
||||
forward_close_msg[11:2] = [
|
||||
CONNECTION_SIZE['Direct Network'],
|
||||
'\x00'
|
||||
]
|
||||
else:
|
||||
forward_close_msg[11:4] = [
|
||||
CONNECTION_SIZE['Backplane'],
|
||||
'\x00',
|
||||
pack_usint(self.attribs['backplane']),
|
||||
pack_usint(self.attribs['cpu slot'])
|
||||
]
|
||||
|
||||
if self.send_rr_data(
|
||||
build_common_packet_format(DATA_ITEM['Unconnected'], ''.join(forward_close_msg), ADDRESS_ITEM['UCMM'])):
|
||||
self._target_is_connected = False
|
||||
return True
|
||||
self._status = (5, "forward_close returned False")
|
||||
logger.warning(self._status)
|
||||
return False
|
||||
|
||||
def un_register_session(self):
|
||||
""" Un-register a connection
|
||||
|
||||
"""
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND['unregister_session'], 0)
|
||||
self._send()
|
||||
self._session = None
|
||||
|
||||
def _send(self):
|
||||
"""
|
||||
socket send
|
||||
:return: true if no error otherwise false
|
||||
"""
|
||||
try:
|
||||
logger.debug(print_bytes_msg(self._message, '-------------- SEND --------------'))
|
||||
self.__sock.send(self._message)
|
||||
except Exception as e:
|
||||
# self.clean_up()
|
||||
raise CommError(e)
|
||||
|
||||
def _receive(self):
|
||||
"""
|
||||
socket receive
|
||||
:return: true if no error otherwise false
|
||||
"""
|
||||
try:
|
||||
self._reply = self.__sock.receive()
|
||||
logger.debug(print_bytes_msg(self._reply, '----------- RECEIVE -----------'))
|
||||
except Exception as e:
|
||||
# self.clean_up()
|
||||
raise CommError(e)
|
||||
|
||||
def open(self, ip_address, direct_connection=False):
|
||||
"""
|
||||
socket open
|
||||
:param: ip address to connect to and type of connection. By default direct connection is disabled
|
||||
:return: true if no error otherwise false
|
||||
"""
|
||||
# set type of connection needed
|
||||
self.__direct_connections = direct_connection
|
||||
|
||||
# handle the socket layer
|
||||
if not self._connection_opened:
|
||||
try:
|
||||
if self.__sock is None:
|
||||
self.__sock = Socket()
|
||||
self.__sock.connect(ip_address, self.attribs['port'])
|
||||
self._connection_opened = True
|
||||
self.attribs['ip address'] = ip_address
|
||||
self.generate_cid()
|
||||
if self.register_session() is None:
|
||||
self._status = (13, "Session not registered")
|
||||
return False
|
||||
|
||||
# not sure, but maybe I can remove this because it is only used to clean up any previous unclosed connection
|
||||
self.forward_close()
|
||||
return True
|
||||
except Exception as e:
|
||||
# self.clean_up()
|
||||
raise CommError(e)
|
||||
|
||||
def close(self):
|
||||
"""
|
||||
socket close
|
||||
:return: true if no error otherwise false
|
||||
"""
|
||||
try:
|
||||
if self._target_is_connected:
|
||||
self.forward_close()
|
||||
if self._session != 0:
|
||||
self.un_register_session()
|
||||
if self.__sock:
|
||||
self.__sock.close()
|
||||
except Exception as e:
|
||||
raise CommError(e)
|
||||
|
||||
self.clean_up()
|
||||
|
||||
def clean_up(self):
|
||||
self.__sock = None
|
||||
self._target_is_connected = False
|
||||
self._session = 0
|
||||
self._connection_opened = False
|
||||
|
||||
def is_connected(self):
|
||||
return self._connection_opened
|
||||
483
daq/pycomm-master/pycomm/cip/cip_const.py
Executable file
@@ -0,0 +1,483 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# cip_const.py - A set of structures and constants used to implement the Ethernet/IP protocol
|
||||
#
|
||||
#
|
||||
# Copyright (c) 2014 Agostino Ruscito <ruscito@gmail.com>
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
# of this software and associated documentation files (the "Software"), to deal
|
||||
# in the Software without restriction, including without limitation the rights
|
||||
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
# copies of the Software, and to permit persons to whom the Software is
|
||||
# furnished to do so, subject to the following conditions:
|
||||
#
|
||||
# The above copyright notice and this permission notice shall be included in all
|
||||
# copies or substantial portions of the Software.
|
||||
#
|
||||
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
# SOFTWARE.
|
||||
#
|
||||
|
||||
ELEMENT_ID = {
|
||||
"8-bit": '\x28',
|
||||
"16-bit": '\x29',
|
||||
"32-bit": '\x2a'
|
||||
}
|
||||
|
||||
CLASS_ID = {
|
||||
"8-bit": '\x20',
|
||||
"16-bit": '\x21',
|
||||
}
|
||||
|
||||
INSTANCE_ID = {
|
||||
"8-bit": '\x24',
|
||||
"16-bit": '\x25'
|
||||
}
|
||||
|
||||
ATTRIBUTE_ID = {
|
||||
"8-bit": '\x30',
|
||||
"16-bit": '\x31'
|
||||
}
|
||||
|
||||
# Path are combined as:
|
||||
# CLASS_ID + PATHS
|
||||
# For example PCCC path is CLASS_ID["8-bit"]+PATH["PCCC"] -> 0x20, 0x67, 0x24, 0x01.
|
||||
PATH = {
|
||||
'Connection Manager': '\x06\x24\x01',
|
||||
'Router': '\x02\x24\x01',
|
||||
'Backplane Data Type': '\x66\x24\x01',
|
||||
'PCCC': '\x67\x24\x01',
|
||||
'DHCP Channel A': '\xa6\x24\x01\x01\x2c\x01',
|
||||
'DHCP Channel B': '\xa6\x24\x01\x02\x2c\x01'
|
||||
}
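# Worked example (illustrative only), expanding the note above: the full PCCC request path is the
# 8-bit class segment followed by PATH["PCCC"].
#
#     >>> [hex(ord(c)) for c in CLASS_ID["8-bit"] + PATH["PCCC"]]
#     ['0x20', '0x67', '0x24', '0x1']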
|
||||
|
||||
ENCAPSULATION_COMMAND = { # Volume 2: 2-3.2 Command Field UINT 2 byte
|
||||
"nop": '\x00\x00',
|
||||
"list_targets": '\x01\x00',
|
||||
"list_services": '\x04\x00',
|
||||
"list_identity": '\x63\x00',
|
||||
"list_interfaces": '\x64\x00',
|
||||
"register_session": '\x65\x00',
|
||||
"unregister_session": '\x66\x00',
|
||||
"send_rr_data": '\x6F\x00',
|
||||
"send_unit_data": '\x70\x00'
|
||||
}
|
||||
|
||||
"""
|
||||
When a tag is created, an instance of the Symbol Object (Class ID 0x6B) is created
|
||||
inside the controller.
|
||||
|
||||
When a UDT is created, an instance of the Template object (Class ID 0x6C) is
|
||||
created to hold information about the structure makeup.
|
||||
"""
|
||||
CLASS_CODE = {
|
||||
"Message Router": '\x02', # Volume 1: 5-1
|
||||
"Symbol Object": '\x6b',
|
||||
"Template Object": '\x6c',
|
||||
"Connection Manager": '\x06' # Volume 1: 3-5
|
||||
}
|
||||
|
||||
CONNECTION_MANAGER_INSTANCE = {
|
||||
'Open Request': '\x01',
|
||||
'Open Format Rejected': '\x02',
|
||||
'Open Resource Rejected': '\x03',
|
||||
'Open Other Rejected': '\x04',
|
||||
'Close Request': '\x05',
|
||||
'Close Format Request': '\x06',
|
||||
'Close Other Request': '\x07',
|
||||
'Connection Timeout': '\x08'
|
||||
}
|
||||
|
||||
TAG_SERVICES_REQUEST = {
|
||||
"Read Tag": 0x4c,
|
||||
"Read Tag Fragmented": 0x52,
|
||||
"Write Tag": 0x4d,
|
||||
"Write Tag Fragmented": 0x53,
|
||||
"Read Modify Write Tag": 0x4e,
|
||||
"Multiple Service Packet": 0x0a,
|
||||
"Get Instance Attributes List": 0x55,
|
||||
"Get Attributes": 0x03,
|
||||
"Read Template": 0x4c,
|
||||
}
|
||||
|
||||
TAG_SERVICES_REPLY = {
|
||||
0xcc: "Read Tag",
|
||||
0xd2: "Read Tag Fragmented",
|
||||
0xcd: "Write Tag",
|
||||
0xd3: "Write Tag Fragmented",
|
||||
0xce: "Read Modify Write Tag",
|
||||
0x8a: "Multiple Service Packet",
|
||||
0xd5: "Get Instance Attributes List",
|
||||
0x83: "Get Attributes",
|
||||
0xcc: "Read Template"
|
||||
}
|
||||
|
||||
|
||||
I_TAG_SERVICES_REPLY = {
|
||||
"Read Tag": 0xcc,
|
||||
"Read Tag Fragmented": 0xd2,
|
||||
"Write Tag": 0xcd,
|
||||
"Write Tag Fragmented": 0xd3,
|
||||
"Read Modify Write Tag": 0xce,
|
||||
"Multiple Service Packet": 0x8a,
|
||||
"Get Instance Attributes List": 0xd5,
|
||||
"Get Attributes": 0x83,
|
||||
"Read Template": 0xcc
|
||||
}
|
||||
|
||||
|
||||
"""
|
||||
EtherNet/IP Encapsulation Error Codes
|
||||
|
||||
Standard CIP Encapsulation Error returned in the cip message header
|
||||
"""
|
||||
STATUS = {
|
||||
0x0000: "Success",
|
||||
0x0001: "The sender issued an invalid or unsupported encapsulation command",
|
||||
0x0002: "Insufficient memory",
|
||||
0x0003: "Poorly formed or incorrect data in the data portion",
|
||||
0x0064: "An originator used an invalid session handle when sending an encapsulation message to the target",
|
||||
0x0065: "The target received a message of invalid length",
|
||||
0x0069: "Unsupported Protocol Version"
|
||||
}
|
||||
|
||||
"""
|
||||
MSG Error Codes:
|
||||
|
||||
The following error codes have been taken from:
|
||||
|
||||
Rockwell Automation Publication
|
||||
1756-RM003P-EN-P - December 2014
|
||||
"""
|
||||
SERVICE_STATUS = {
|
||||
0x01: "Connection failure (see extended status)",
|
||||
0x02: "Insufficient resource",
|
||||
0x03: "Invalid value",
|
||||
0x04: "IOI syntax error. A syntax error was detected decoding the Request Path (see extended status)",
|
||||
0x05: "Destination unknown, class unsupported, instance \nundefined or structure element undefined (see extended status)",
|
||||
0x06: "Insufficient Packet Space",
|
||||
0x07: "Connection lost",
|
||||
0x08: "Service not supported",
|
||||
0x09: "Error in data segment or invalid attribute value",
|
||||
0x0A: "Attribute list error",
|
||||
0x0B: "State already exist",
|
||||
0x0C: "Object state conflict",
|
||||
0x0D: "Object already exist",
|
||||
0x0E: "Attribute not settable",
|
||||
0x0F: "Permission denied",
|
||||
0x10: "Device state conflict",
|
||||
0x11: "Reply data too large",
|
||||
0x12: "Fragmentation of a primitive value",
|
||||
0x13: "Insufficient command data",
|
||||
0x14: "Attribute not supported",
|
||||
0x15: "Too much data",
|
||||
0x1A: "Bridge request too large",
|
||||
0x1B: "Bridge response too large",
|
||||
0x1C: "Attribute list shortage",
|
||||
0x1D: "Invalid attribute list",
|
||||
0x1E: "Request service error",
|
||||
0x1F: "Connection related failure (see extended status)",
|
||||
0x22: "Invalid reply received",
|
||||
0x25: "Key segment error",
|
||||
0x26: "Invalid IOI error",
|
||||
0x27: "Unexpected attribute in list",
|
||||
0x28: "DeviceNet error - invalid member ID",
|
||||
0x29: "DeviceNet error - member not settable",
|
||||
0xD1: "Module not in run state",
|
||||
0xFB: "Message port not supported",
|
||||
0xFC: "Message unsupported data type",
|
||||
0xFD: "Message uninitialized",
|
||||
0xFE: "Message timeout",
|
||||
0xff: "General Error (see extended status)"
|
||||
}
|
||||
|
||||
EXTEND_CODES = {
|
||||
0x01: {
|
||||
0x0100: "Connection in use",
|
||||
0x0103: "Transport not supported",
|
||||
0x0106: "Ownership conflict",
|
||||
0x0107: "Connection not found",
|
||||
0x0108: "Invalid connection type",
|
||||
0x0109: "Invalid connection size",
|
||||
0x0110: "Module not configured",
|
||||
0x0111: "EPR not supported",
|
||||
0x0114: "Wrong module",
|
||||
0x0115: "Wrong device type",
|
||||
0x0116: "Wrong revision",
|
||||
0x0118: "Invalid configuration format",
|
||||
0x011A: "Application out of connections",
|
||||
0x0203: "Connection timeout",
|
||||
0x0204: "Unconnected message timeout",
|
||||
0x0205: "Unconnected send parameter error",
|
||||
0x0206: "Message too large",
|
||||
0x0301: "No buffer memory",
|
||||
0x0302: "Bandwidth not available",
|
||||
0x0303: "No screeners available",
|
||||
0x0305: "Signature match",
|
||||
0x0311: "Port not available",
|
||||
0x0312: "Link address not available",
|
||||
0x0315: "Invalid segment type",
|
||||
0x0317: "Connection not scheduled"
|
||||
},
|
||||
0x04: {
|
||||
0x0000: "Extended status out of memory",
|
||||
0x0001: "Extended status out of instances"
|
||||
},
|
||||
0x05: {
|
||||
0x0000: "Extended status out of memory",
|
||||
0x0001: "Extended status out of instances"
|
||||
},
|
||||
0x1F: {
|
||||
0x0203: "Connection timeout"
|
||||
},
|
||||
0xff: {
|
||||
0x7: "Wrong data type",
|
||||
0x2001: "Excessive IOI",
|
||||
0x2002: "Bad parameter value",
|
||||
0x2018: "Semaphore reject",
|
||||
0x201B: "Size too small",
|
||||
0x201C: "Invalid size",
|
||||
0x2100: "Privilege failure",
|
||||
0x2101: "Invalid keyswitch position",
|
||||
0x2102: "Password invalid",
|
||||
0x2103: "No password issued",
|
||||
0x2104: "Address out of range",
|
||||
0x2105: "Address and how many out of range",
|
||||
0x2106: "Data in use",
|
||||
0x2107: "Type is invalid or not supported",
|
||||
0x2108: "Controller in upload or download mode",
|
||||
0x2109: "Attempt to change number of array dimensions",
|
||||
0x210A: "Invalid symbol name",
|
||||
0x210B: "Symbol does not exist",
|
||||
0x210E: "Search failed",
|
||||
0x210F: "Task cannot start",
|
||||
0x2110: "Unable to write",
|
||||
0x2111: "Unable to read",
|
||||
0x2112: "Shared routine not editable",
|
||||
0x2113: "Controller in faulted mode",
|
||||
0x2114: "Run mode inhibited"
|
||||
|
||||
}
|
||||
}
|
||||
DATA_ITEM = {
|
||||
'Connected': '\xb1\x00',
|
||||
'Unconnected': '\xb2\x00'
|
||||
}
|
||||
|
||||
ADDRESS_ITEM = {
|
||||
'Connection Based': '\xa1\x00',
|
||||
'Null': '\x00\x00',
|
||||
'UCMM': '\x00\x00'
|
||||
}
|
||||
|
||||
UCMM = {
|
||||
'Interface Handle': 0,
|
||||
'Item Count': 2,
|
||||
'Address Type ID': 0,
|
||||
'Address Length': 0,
|
||||
'Data Type ID': 0x00b2
|
||||
}
|
||||
|
||||
CONNECTION_SIZE = {
|
||||
'Backplane': '\x03', # CLX
|
||||
'Direct Network': '\x02'
|
||||
}
|
||||
|
||||
HEADER_SIZE = 24
|
||||
EXTENDED_SYMBOL = '\x91'
|
||||
BOOL_ONE = 0xff
|
||||
REQUEST_SERVICE = 0
|
||||
REQUEST_PATH_SIZE = 1
|
||||
REQUEST_PATH = 2
|
||||
SUCCESS = 0
|
||||
INSUFFICIENT_PACKETS = 6
|
||||
OFFSET_MESSAGE_REQUEST = 40
|
||||
|
||||
|
||||
FORWARD_CLOSE = '\x4e'
|
||||
UNCONNECTED_SEND = '\x52'
|
||||
FORWARD_OPEN = '\x54'
|
||||
LARGE_FORWARD_OPEN = '\x5b'
|
||||
GET_CONNECTION_DATA = '\x56'
|
||||
SEARCH_CONNECTION_DATA = '\x57'
|
||||
GET_CONNECTION_OWNER = '\x5a'
|
||||
MR_SERVICE_SIZE = 2
|
||||
|
||||
PADDING_BYTE = '\x00'
|
||||
PRIORITY = '\x0a'
|
||||
TIMEOUT_TICKS = '\x05'
|
||||
TIMEOUT_MULTIPLIER = '\x01'
|
||||
TRANSPORT_CLASS = '\xa3'
|
||||
|
||||
CONNECTION_PARAMETER = {
|
||||
'PLC5': 0x4302,
|
||||
'SLC500': 0x4302,
|
||||
'CNET': 0x4320,
|
||||
'DHP': 0x4302,
|
||||
'Default': 0x43f8,
|
||||
}
|
||||
|
||||
"""
|
||||
Atomic Data Type:
|
||||
|
||||
Bit = Bool
|
||||
Bit array = DWORD (32-bit boolean array)
8-bit integer = SINT
16-bit integer = INT
|
||||
32-bit integer = DINT
|
||||
32-bit float = REAL
|
||||
64-bit integer = LINT
|
||||
|
||||
From Rockwell Automation Publication 1756-PM020C-EN-P November 2012:
|
||||
When reading a BOOL tag, the values returned for 0 and 1 are 0 and 0xff, respectively.
|
||||
"""
|
||||
|
||||
S_DATA_TYPE = {
|
||||
'BOOL': 0xc1,
|
||||
'SINT': 0xc2, # Signed 8-bit integer
|
||||
'INT': 0xc3, # Signed 16-bit integer
|
||||
'DINT': 0xc4, # Signed 32-bit integer
|
||||
'LINT': 0xc5, # Signed 64-bit integer
|
||||
'USINT': 0xc6, # Unsigned 8-bit integer
|
||||
'UINT': 0xc7, # Unsigned 16-bit integer
|
||||
'UDINT': 0xc8, # Unsigned 32-bit integer
|
||||
'ULINT': 0xc9, # Unsigned 64-bit integer
|
||||
'REAL': 0xca, # 32-bit floating point
|
||||
'LREAL': 0xcb, # 64-bit floating point
|
||||
'STIME': 0xcc, # Synchronous time
|
||||
'DATE': 0xcd,
|
||||
'TIME_OF_DAY': 0xce,
|
||||
'DATE_AND_TIME': 0xcf,
|
||||
'STRING': 0xd0, # character string (1 byte per character)
|
||||
'BYTE': 0xd1, # byte string 8-bits
|
||||
'WORD': 0xd2, # byte string 16-bits
|
||||
'DWORD': 0xd3, # byte string 32-bits
|
||||
'LWORD': 0xd4, # byte string 64-bits
|
||||
'STRING2': 0xd5, # character string (2 byte per character)
|
||||
'FTIME': 0xd6, # Duration high resolution
|
||||
'LTIME': 0xd7, # Duration long
|
||||
'ITIME': 0xd8, # Duration short
|
||||
'STRINGN': 0xd9, # character string (n byte per character)
|
||||
'SHORT_STRING': 0xda, # character string (1 byte per character, 1 byte length indicator)
|
||||
'TIME': 0xdb, # Duration in milliseconds
|
||||
'EPATH': 0xdc, # CIP Path segment
|
||||
'ENGUNIT': 0xdd, # Engineering Units
|
||||
'STRINGI': 0xde # International character string
|
||||
}
|
||||
|
||||
I_DATA_TYPE = {
|
||||
0xc1: 'BOOL',
|
||||
0xc2: 'SINT', # Signed 8-bit integer
|
||||
0xc3: 'INT', # Signed 16-bit integer
|
||||
0xc4: 'DINT', # Signed 32-bit integer
|
||||
0xc5: 'LINT', # Signed 64-bit integer
|
||||
0xc6: 'USINT', # Unsigned 8-bit integer
|
||||
0xc7: 'UINT', # Unsigned 16-bit integer
|
||||
0xc8: 'UDINT', # Unsigned 32-bit integer
|
||||
0xc9: 'ULINT', # Unsigned 64-bit integer
|
||||
0xca: 'REAL', # 32-bit floating point
|
||||
0xcb: 'LREAL', # 64-bit floating point
|
||||
0xcc: 'STIME', # Synchronous time
|
||||
0xcd: 'DATE',
|
||||
0xce: 'TIME_OF_DAY',
|
||||
0xcf: 'DATE_AND_TIME',
|
||||
0xd0: 'STRING', # character string (1 byte per character)
|
||||
0xd1: 'BYTE', # byte string 8-bits
|
||||
0xd2: 'WORD', # byte string 16-bits
|
||||
0xd3: 'DWORD', # byte string 32-bits
|
||||
0xd4: 'LWORD', # byte string 64-bits
|
||||
0xd5: 'STRING2', # character string (2 byte per character)
|
||||
0xd6: 'FTIME', # Duration high resolution
|
||||
0xd7: 'LTIME', # Duration long
|
||||
0xd8: 'ITIME', # Duration short
|
||||
0xd9: 'STRINGN', # character string (n byte per character)
|
||||
0xda: 'SHORT_STRING', # character string (1 byte per character, 1 byte length indicator)
|
||||
0xdb: 'TIME', # Duration in milliseconds
|
||||
0xdc: 'EPATH', # CIP Path segment
|
||||
0xdd: 'ENGUNIT', # Engineering Units
|
||||
0xde: 'STRINGI' # International character string
|
||||
}
|
||||
|
||||
REPLAY_INFO = {
|
||||
0x4e: 'FORWARD_CLOSE (4E,00)',
|
||||
0x52: 'UNCONNECTED_SEND (52,00)',
|
||||
0x54: 'FORWARD_OPEN (54,00)',
|
||||
0x6f: 'send_rr_data (6F,00)',
|
||||
0x70: 'send_unit_data (70,00)',
|
||||
0x00: 'nop',
|
||||
0x01: 'list_targets',
|
||||
0x04: 'list_services',
|
||||
0x63: 'list_identity',
|
||||
0x64: 'list_interfaces',
|
||||
0x65: 'register_session',
|
||||
0x66: 'unregister_session',
|
||||
}
|
||||
|
||||
PCCC_DATA_TYPE = {
|
||||
'N': '\x89',
|
||||
'B': '\x85',
|
||||
'T': '\x86',
|
||||
'C': '\x87',
|
||||
'S': '\x84',
|
||||
'F': '\x8a',
|
||||
'ST': '\x8d',
|
||||
'A': '\x8e',
|
||||
'R': '\x88',
|
||||
'O': '\x8b',
|
||||
'I': '\x8c'
|
||||
}
|
||||
|
||||
PCCC_DATA_SIZE = {
|
||||
'N': 2,
|
||||
# 'L': 4,
|
||||
'B': 2,
|
||||
'T': 6,
|
||||
'C': 6,
|
||||
'S': 2,
|
||||
'F': 4,
|
||||
'ST': 84,
|
||||
'A': 2,
|
||||
'R': 6,
|
||||
'O': 2,
|
||||
'I': 2
|
||||
}
|
||||
|
||||
PCCC_CT = {
|
||||
'PRE': 1,
|
||||
'ACC': 2,
|
||||
'EN': 15,
|
||||
'TT': 14,
|
||||
'DN': 13,
|
||||
'CU': 15,
|
||||
'CD': 14,
|
||||
'OV': 12,
|
||||
'UN': 11,
|
||||
'UA': 10
|
||||
}
|
||||
|
||||
PCCC_ERROR_CODE = {
|
||||
-2: "Not Acknowledged (NAK)",
|
||||
-3: "No Reponse, Check COM Settings",
|
||||
-4: "Unknown Message from DataLink Layer",
|
||||
-5: "Invalid Address",
|
||||
-6: "Could Not Open Com Port",
|
||||
-7: "No data specified to data link layer",
|
||||
-8: "No data returned from PLC",
|
||||
-20: "No Data Returned",
|
||||
16: "Illegal Command or Format, Address may not exist or not enough elements in data file",
|
||||
32: "PLC Has a Problem and Will Not Communicate",
|
||||
48: "Remote Node Host is Missing, Disconnected, or Shut Down",
|
||||
64: "Host Could Not Complete Function Due To Hardware Fault",
|
||||
80: "Addressing problem or Memory Protect Rungs",
|
||||
96: "Function not allows due to command protection selection",
|
||||
112: "Processor is in Program mode",
|
||||
128: "Compatibility mode file missing or communication zone problem",
|
||||
144: "Remote node cannot buffer command",
|
||||
240: "Error code in EXT STS Byte"
|
||||
}
|
||||
8
daq/pycomm-master/pycomm/common.py
Executable file
@@ -0,0 +1,8 @@
|
||||
__author__ = 'Agostino Ruscito'
|
||||
__version__ = "1.0.8"
|
||||
__date__ = "08 03 2015"
|
||||
|
||||
class PycommError(Exception):
|
||||
pass
|
||||
|
||||
|
||||
37
daq/pycomm-master/setup.py
Executable file
@@ -0,0 +1,37 @@
|
||||
from distutils.core import setup
|
||||
from pycomm import common
|
||||
import os
|
||||
|
||||
|
||||
def read(file_name):
|
||||
return open(os.path.join(os.path.dirname(__file__), file_name)).read()
|
||||
|
||||
setup(
|
||||
name="pycomm",
|
||||
author="Agostino Ruscito",
|
||||
author_email="uscito@gmail.com",
|
||||
version=common.__version__,
|
||||
description="A PLC communication library for Python",
|
||||
long_description=read('README.rst'),
|
||||
license="MIT",
|
||||
url="https://github.com/ruscito/pycomm",
|
||||
packages=[
|
||||
"pycomm",
|
||||
"pycomm.ab_comm",
|
||||
"pycomm.cip"
|
||||
],
|
||||
classifiers=[
|
||||
'Development Status :: 5 - Production/Stable',
|
||||
'Intended Audience :: Developers',
|
||||
'Natural Language :: English',
|
||||
'License :: OSI Approved :: MIT License',
|
||||
'Operating System :: OS Independent',
|
||||
'Programming Language :: Python',
|
||||
'Programming Language :: Python :: 2',
|
||||
'Programming Language :: Python :: 2.6',
|
||||
'Programming Language :: Python :: 2.7',
|
||||
'Programming Language :: Python :: 3',
|
||||
'Programming Language :: Python :: 3.3',
|
||||
'Topic :: Software Development :: Libraries :: Python Modules',
|
||||
],
|
||||
)
|
||||
1
daq/pycomm_helper
Submodule
@@ -1,20 +1,21 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
'''
|
||||
MySQL Tag Server
|
||||
Tag Logger
|
||||
Created on April 7, 2016
|
||||
@author: Patrick McDonagh
|
||||
@description: Continuously loops through a list of tags to store values from a PLC into a MySQL database
|
||||
@description: Continuously loops through a list of tags to store values from a PLC
|
||||
'''
|
||||
|
||||
from tag.tag import Tag
|
||||
import traceback
|
||||
import time
|
||||
import requests
|
||||
import json
|
||||
import requests
|
||||
from pycomm_helper.tag import Tag
|
||||
|
||||
# DEFAULTS
|
||||
web_address = "https://localhost:3000"
|
||||
db_address = "10.10.10.10:3000"
|
||||
db_url = "https://{}".format(db_address)
|
||||
scan_rate = 30 # seconds
|
||||
save_all = "test" # use True, False, or any string
|
||||
plc_handshake_tags = {}
|
||||
@@ -25,13 +26,13 @@ device_types = {}
|
||||
|
||||
|
||||
def main():
|
||||
global web_address, scan_rate, save_all, tag_store, device_types, plc_handshake_tags, last_handshake_time
|
||||
global db_address, scan_rate, save_all, tag_store, device_types, plc_handshake_tags, last_handshake_time
|
||||
try:
|
||||
# Get tags stored in database
|
||||
get_tag_request_data = {'where': '{"tag_class": 5}'}
|
||||
get_tag_request = requests.get('{}/tag'.format(web_address), params=get_tag_request_data, verify=False)
|
||||
get_tag_request = requests.get('{}/tag'.format(db_url), params=get_tag_request_data, verify=False)
|
||||
tags = json.loads(get_tag_request.text)
|
||||
except Exception, e:
|
||||
except Exception as e:
|
||||
print("Error getting tags: {}".format(e))
|
||||
time.sleep(10)
|
||||
main()
|
||||
@@ -39,52 +40,52 @@ def main():
|
||||
try:
|
||||
# Get tags stored in database
|
||||
|
||||
get_device_type_request = requests.get('{}/device_type'.format(web_address), verify=False)
|
||||
get_device_type_request = requests.get('{}/device_type'.format(db_url), verify=False)
|
||||
device_types_json = json.loads(get_device_type_request.text)
|
||||
for t in device_types_json:
|
||||
device_types[t['id']] = t['dType']
|
||||
except Exception, e:
|
||||
except Exception as e:
|
||||
print("Error getting tags: {}".format(e))
|
||||
time.sleep(10)
|
||||
main()
|
||||
|
||||
try:
|
||||
sr_req_data = 'where={"parameter": "scan_rate"}'
|
||||
sr_req = requests.get('{}/config?{}'.format(web_address, sr_req_data), verify=False)
|
||||
sr_req = requests.get('{}/config?{}'.format(db_url, sr_req_data), verify=False)
|
||||
sr_try = json.loads(sr_req.text)
|
||||
if len(sr_try) > 0:
|
||||
scan_rate = int(sr_try[0]['val'])
|
||||
except Exception, e:
|
||||
except Exception as e:
|
||||
print("Error getting scan rate: {}".format(e))
|
||||
print("I'll just use {} seconds as the scan rate...".format(scan_rate))
|
||||
|
||||
try:
|
||||
sa_req_data = {"where": {"parameter": "save_all"}}
|
||||
sa_req = requests.get('{}/config'.format(web_address), params=sa_req_data, verify=False)
|
||||
sa_req = requests.get('{}/config'.format(db_url), params=sa_req_data, verify=False)
|
||||
sa_try = json.loads(sa_req.text)
|
||||
if len(sa_try) > 0:
|
||||
if sa_try[0]['val'].lower() == "true":
|
||||
save_all = True
|
||||
elif sa_try[0]['val'].lower() == "false":
|
||||
save_all = False
|
||||
except Exception, e:
|
||||
except Exception as e:
|
||||
print("Error getting save-all: {}".format(e))
|
||||
print("I'll just use {} as the save-all parameter...".format(save_all))
|
||||
|
||||
try:
|
||||
# Get tags stored in database
|
||||
get_hs_request_data = {'where': '{"tag_class": 6}'}
|
||||
get_hs_request = requests.get('{}/tag'.format(web_address), params=get_hs_request_data, verify=False)
|
||||
get_hs_request = requests.get('{}/tag'.format(db_url), params=get_hs_request_data, verify=False)
|
||||
hs_tags = json.loads(get_hs_request.text)
|
||||
if len(hs_tags) > 0:
|
||||
for hs in hs_tags:
|
||||
plc_handshake_tags[hs['name']] = Tag(hs['name'], hs['tag'], hs['id'], hs['data_type'], hs['change_threshold'], hs['guarantee_sec'], mapFn=hs['map_function'], ip_address=hs['deviceID']['address'], device_type=device_types[hs['deviceID']['device_type']])
|
||||
except Exception, e:
|
||||
except Exception as e:
|
||||
print("Error getting handshake tags: {}".format(e))
|
||||
|
||||
for t in tags:
|
||||
# name, tag, db_id, data_type, change_threshold, guarantee_sec, mapFn=None, device_type='CLX', ip_address='192.168.1.10'):
|
||||
tag_store[t['name']] = Tag(t['name'], t['tag'], t['id'], t['data_type'], t['change_threshold'], t['guarantee_sec'], mapFn=t['map_function'], ip_address=t['deviceID']['address'], device_type=device_types[t['deviceID']['device_type']])
|
||||
tag_store[t['name']] = Tag(t['name'], t['tag'], t['id'], t['data_type'], t['change_threshold'], t['guarantee_sec'], mapFn=t['map_function'], ip_address=t['deviceID']['address'], device_type=device_types[t['deviceID']['device_type']], db_address=db_address)
|
||||
|
||||
while True:
|
||||
for tag in tag_store:
|
||||
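A condensed sketch of the pattern used in main() above: query the tag endpoint with a JSON `where` filter, then build a Tag object per returned row. The endpoint, field names, and Tag arguments are taken from the diff; everything else (the address value, skipping the device_type lookup) is illustrative only:

    import json
    import requests
    from pycomm_helper.tag import Tag

    db_address = "10.10.10.10:3000"            # same default as above; adjust per deployment
    db_url = "https://{0}".format(db_address)

    # Fetch class-5 tags, mirroring get_tag_request in main()
    resp = requests.get('{0}/tag'.format(db_url),
                        params={'where': '{"tag_class": 5}'}, verify=False)
    tags = json.loads(resp.text)

    tag_store = {}
    for t in tags:
        # Argument order follows the comment in main():
        # name, tag, db_id, data_type, change_threshold, guarantee_sec, ...
        # device_type is omitted here; main() resolves it via the /device_type endpoint.
        tag_store[t['name']] = Tag(t['name'], t['tag'], t['id'], t['data_type'],
                                   t['change_threshold'], t['guarantee_sec'],
                                   mapFn=t['map_function'],
                                   ip_address=t['deviceID']['address'],
                                   db_address=db_address)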
14
daq_sample/Dockerfile.rpi
Normal file
@@ -0,0 +1,14 @@
|
||||
FROM patrickjmcd/rpi-python3:latest
|
||||
|
||||
# Copy source files
|
||||
RUN mkdir /root/tag-logger
|
||||
COPY sampleData.py /root/tag-logger/sampleData.py
|
||||
COPY pycomm-master /tmp/pycomm
|
||||
COPY pycomm_helper /tmp/pycomm_helper
|
||||
|
||||
# Install some python packages
|
||||
RUN pip install requests
|
||||
RUN cd /tmp/pycomm && python setup.py install && cd /
|
||||
RUN cd /tmp/pycomm_helper && python setup.py install && cd /
|
||||
|
||||
CMD ["python", "/root/tag-logger/sampleData.py"]
|
||||
14
daq_sample/Dockerfile.ubuntu
Normal file
@@ -0,0 +1,14 @@
|
||||
FROM python:latest
|
||||
|
||||
# Copy source files
|
||||
RUN mkdir /root/tag-logger
|
||||
COPY sampleData.py /root/tag-logger/sampleData.py
|
||||
COPY pycomm-master /tmp/pycomm
|
||||
COPY pycomm_helper /tmp/pycomm_helper
|
||||
|
||||
# Install some python packages
|
||||
RUN pip install requests
|
||||
RUN cd /tmp/pycomm && python setup.py install && cd /
|
||||
RUN cd /tmp/pycomm_helper && python setup.py install && cd /
|
||||
|
||||
CMD ["python", "/root/tag-logger/sampleData.py"]
|
||||
12
daq_sample/pycomm-master/.travis.yml
Executable file
@@ -0,0 +1,12 @@
|
||||
language: python
|
||||
|
||||
python:
|
||||
- "2.6"
|
||||
- "2.7"
|
||||
- "3.2"
|
||||
- "3.3"
|
||||
- "3.4"
|
||||
|
||||
install: python setup.py install
|
||||
|
||||
script: nosetests
|
||||
39
daq_sample/pycomm-master/CHANGES
Executable file
@@ -0,0 +1,39 @@
|
||||
CHANGES
|
||||
=======
|
||||
|
||||
1.0.8
|
||||
-----
|
||||
Number 0001:
|
||||
handling of raw values (hex) added to functions read_array and write_array: handling of raw values can be switched
|
||||
on/off with additional parameter
|
||||
|
||||
Number 0002:
|
||||
is a bugfix when reading the tag_list from a PLC. If one tag is of datatype bool and it is part of a bool
|
||||
array within an SINT, the tag type value contains also the bit position.
|
||||
|
||||
Number 0003:
|
||||
code is always logging into a file (pycomm.log) into working path. Code changed, so that it is possible to configure
|
||||
the logging from the main application.
|
||||
|
||||
|
||||
|
||||
1.0.6
|
||||
-----
|
||||
|
||||
- Pypi posting
|
||||
|
||||
1.0.0
|
||||
-----
|
||||
|
||||
- Add support for SLC and PLC/05 plc
|
||||
|
||||
0.2.0
|
||||
---
|
||||
|
||||
- Add CIP support class
|
||||
- Add support for ControlLogix PLC
|
||||
|
||||
0.1
|
||||
---
|
||||
|
||||
- Initial release.
|
||||
22
daq_sample/pycomm-master/LICENSE
Executable file
@@ -0,0 +1,22 @@
|
||||
The MIT License (MIT)
|
||||
|
||||
Copyright (c) 2014 Agostino Ruscito
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
||||
|
||||
1
daq_sample/pycomm-master/MANIFEST.in
Executable file
@@ -0,0 +1 @@
|
||||
include README.rst
|
||||
171
daq_sample/pycomm-master/README.rst
Executable file
@@ -0,0 +1,171 @@
|
||||
pycomm
|
||||
======
|
||||
pycomm is a package that includes a collection of modules used to communicate with PLCs.
|
||||
At the moment the first module in the package is ab_comm.
|
||||
|
||||
Test
|
||||
~~~~
|
||||
The library is currently tested on Python 2.6 and 2.7.
|
||||
|
||||
.. image:: https://travis-ci.org/ruscito/pycomm.svg?branch=master
|
||||
:target: https://travis-ci.org/ruscito/pycomm
|
||||
|
||||
Setup
|
||||
~~~~~
|
||||
The package can be installed from
|
||||
|
||||
GitHub:
|
||||
::
|
||||
|
||||
git clone https://github.com/ruscito/pycomm.git
|
||||
cd pycomm
|
||||
sudo python setup.py install
|
||||
|
||||
|
||||
PyPi:
|
||||
::
|
||||
pip install pycomm
|
||||
|
||||
ab_comm
|
||||
~~~~~~~
|
||||
ab_comm is a module that contains a set of classes used to interface Rockwell PLCs using Ethernet/IP protocol.
|
||||
The "clx" class can be used to communicate with Compactlogix, Controllogix PLCs
|
||||
The "slc" can be used to communicate with Micrologix or SLC PLCs
|
||||
|
||||
I tried to follow the CIP specifications volume 1 and 2 as well as `Rockwell Automation Publication 1756-PM020-EN-P - November 2012`_ .
|
||||
|
||||
.. _Rockwell Automation Publication 1756-PM020-EN-P - November 2012: http://literature.rockwellautomation.com/idc/groups/literature/documents/pm/1756-pm020_-en-p.pdf
|
||||
|
||||
See the following snippet for communication with a Controllogix PLC:
|
||||
|
||||
::
|
||||
|
||||
from pycomm.ab_comm.clx import Driver as ClxDriver
|
||||
import logging
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
logging.basicConfig(
|
||||
filename="ClxDriver.log",
|
||||
format="%(levelname)-10s %(asctime)s %(message)s",
|
||||
level=logging.DEBUG
|
||||
)
|
||||
c = ClxDriver()
|
||||
|
||||
if c.open('172.16.2.161'):
|
||||
|
||||
print(c.read_tag(['ControlWord']))
|
||||
print(c.read_tag(['parts', 'ControlWord', 'Counts']))
|
||||
|
||||
print(c.write_tag('Counts', -26, 'INT'))
|
||||
print(c.write_tag(('Counts', 26, 'INT')))
|
||||
print(c.write_tag([('Counts', 26, 'INT')]))
|
||||
print(c.write_tag([('Counts', -26, 'INT'), ('ControlWord', -30, 'DINT'), ('parts', 31, 'DINT')]))
|
||||
|
||||
# To read an array
|
||||
r_array = c.read_array("TotalCount", 1750)
|
||||
for tag in r_array:
|
||||
print (tag)
|
||||
|
||||
# reset the array to all 0
|
||||
w_array = []
|
||||
for i in xrange(1750):
|
||||
w_array.append(0)
|
||||
c.write_array("TotalCount", "SINT", w_array)
|
||||
|
||||
c.close()
|
||||
|
||||
|
||||
|
||||
|
||||
See the following snippet for communication with a Micrologix PLC:
|
||||
|
||||
|
||||
::
|
||||
|
||||
from pycomm.ab_comm.slc import Driver as SlcDriver
|
||||
import logging
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
logging.basicConfig(
|
||||
filename="SlcDriver.log",
|
||||
format="%(levelname)-10s %(asctime)s %(message)s",
|
||||
level=logging.DEBUG
|
||||
)
|
||||
c = SlcDriver()
|
||||
if c.open('172.16.2.160'):
|
||||
|
||||
print c.read_tag('S:1/5')
|
||||
print c.read_tag('S:60', 2)
|
||||
|
||||
print c.write_tag('N7:0', [-30, 32767, -32767])
|
||||
print c.write_tag('N7:0', 21)
|
||||
print c.read_tag('N7:0', 10)
|
||||
|
||||
print c.write_tag('F8:0', [3.1, 4.95, -32.89])
|
||||
print c.write_tag('F8:0', 21)
|
||||
print c.read_tag('F8:0', 3)
|
||||
|
||||
print c.write_tag('B3:100', [23, -1, 4, 9])
|
||||
print c.write_tag('B3:100', 21)
|
||||
print c.read_tag('B3:100', 4)
|
||||
|
||||
print c.write_tag('T4:3.PRE', 431)
|
||||
print c.read_tag('T4:3.PRE')
|
||||
print c.write_tag('C5:0.PRE', 501)
|
||||
print c.read_tag('C5:0.PRE')
|
||||
print c.write_tag('T4:3.ACC', 432)
|
||||
print c.read_tag('T4:3.ACC')
|
||||
print c.write_tag('C5:0.ACC', 502)
|
||||
print c.read_tag('C5:0.ACC')
|
||||
|
||||
c.write_tag('T4:2.EN', 0)
|
||||
c.write_tag('T4:2.TT', 0)
|
||||
c.write_tag('T4:2.DN', 0)
|
||||
print c.read_tag('T4:2.EN', 1)
|
||||
print c.read_tag('T4:2.TT', 1)
|
||||
print c.read_tag('T4:2.DN',)
|
||||
|
||||
c.write_tag('C5:0.CU', 1)
|
||||
c.write_tag('C5:0.CD', 0)
|
||||
c.write_tag('C5:0.DN', 1)
|
||||
c.write_tag('C5:0.OV', 0)
|
||||
c.write_tag('C5:0.UN', 1)
|
||||
c.write_tag('C5:0.UA', 0)
|
||||
print c.read_tag('C5:0.CU')
|
||||
print c.read_tag('C5:0.CD')
|
||||
print c.read_tag('C5:0.DN')
|
||||
print c.read_tag('C5:0.OV')
|
||||
print c.read_tag('C5:0.UN')
|
||||
print c.read_tag('C5:0.UA')
|
||||
|
||||
c.write_tag('B3:100', 1)
|
||||
print c.read_tag('B3:100')
|
||||
|
||||
c.write_tag('B3/3955', 1)
|
||||
print c.read_tag('B3/3955')
|
||||
|
||||
c.write_tag('N7:0/2', 1)
|
||||
print c.read_tag('N7:0/2')
|
||||
|
||||
print c.write_tag('O:0.0/4', 1)
|
||||
print c.read_tag('O:0.0/4')
|
||||
|
||||
c.close()
|
||||
|
||||
|
||||
The Future
|
||||
~~~~~~~~~~
|
||||
This package is under development.
|
||||
The modules _ab_comm.clx_ and _ab_comm.slc_ are complete at the moment, but other drivers will be added in the future.
|
||||
|
||||
Thanks
|
||||
~~~~~~
|
||||
Thanks to patrickjmcd_ for the help with the Direct Connections and thanks in advance to anyone for feedback and suggestions.
|
||||
|
||||
.. _patrickjmcd: https://github.com/patrickjmcd
|
||||
|
||||
License
|
||||
~~~~~~~
|
||||
pycomm is distributed under the MIT License
|
||||
42
daq_sample/pycomm-master/examples/test_clx_comm.py
Executable file
@@ -0,0 +1,42 @@
|
||||
from pycomm.ab_comm.clx import Driver as ClxDriver
|
||||
import logging
|
||||
|
||||
from time import sleep
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
|
||||
logging.basicConfig(
|
||||
filename="ClxDriver.log",
|
||||
format="%(levelname)-10s %(asctime)s %(message)s",
|
||||
level=logging.DEBUG
|
||||
)
|
||||
c = ClxDriver()
|
||||
|
||||
print c['port']
|
||||
print c.__version__
|
||||
|
||||
|
||||
if c.open('172.16.2.161'):
|
||||
while 1:
|
||||
try:
|
||||
print(c.read_tag(['ControlWord']))
|
||||
print(c.read_tag(['parts', 'ControlWord', 'Counts']))
|
||||
|
||||
print(c.write_tag('Counts', -26, 'INT'))
|
||||
print(c.write_tag(('Counts', 26, 'INT')))
|
||||
print(c.write_tag([('Counts', 26, 'INT')]))
|
||||
print(c.write_tag([('Counts', -26, 'INT'), ('ControlWord', -30, 'DINT'), ('parts', 31, 'DINT')]))
|
||||
sleep(1)
|
||||
except Exception as e:
|
||||
err = c.get_status()
|
||||
c.close()
|
||||
print err
|
||||
pass
|
||||
|
||||
# To read an array
|
||||
r_array = c.read_array("TotalCount", 1750)
|
||||
for tag in r_array:
|
||||
print (tag)
|
||||
|
||||
c.close()
|
||||
72
daq_sample/pycomm-master/examples/test_slc_only.py
Executable file
@@ -0,0 +1,72 @@
|
||||
__author__ = 'agostino'
|
||||
|
||||
from pycomm.ab_comm.slc import Driver as SlcDriver
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
c = SlcDriver(True, 'delete_slc.log')
|
||||
if c.open('172.16.2.160'):
|
||||
|
||||
while 1:
|
||||
try:
|
||||
print c.read_tag('S:1/5')
|
||||
print c.read_tag('S:60', 2)
|
||||
|
||||
print c.write_tag('N7:0', [-30, 32767, -32767])
|
||||
print c.write_tag('N7:0', 21)
|
||||
print c.read_tag('N7:0', 10)
|
||||
|
||||
print c.write_tag('F8:0', [3.1, 4.95, -32.89])
|
||||
print c.write_tag('F8:0', 21)
|
||||
print c.read_tag('F8:0', 3)
|
||||
|
||||
print c.write_tag('B3:100', [23, -1, 4, 9])
|
||||
print c.write_tag('B3:100', 21)
|
||||
print c.read_tag('B3:100', 4)
|
||||
|
||||
print c.write_tag('T4:3.PRE', 431)
|
||||
print c.read_tag('T4:3.PRE')
|
||||
print c.write_tag('C5:0.PRE', 501)
|
||||
print c.read_tag('C5:0.PRE')
|
||||
print c.write_tag('T4:3.ACC', 432)
|
||||
print c.read_tag('T4:3.ACC')
|
||||
print c.write_tag('C5:0.ACC', 502)
|
||||
print c.read_tag('C5:0.ACC')
|
||||
|
||||
c.write_tag('T4:2.EN', 0)
|
||||
c.write_tag('T4:2.TT', 0)
|
||||
c.write_tag('T4:2.DN', 0)
|
||||
print c.read_tag('T4:2.EN', 1)
|
||||
print c.read_tag('T4:2.TT', 1)
|
||||
print c.read_tag('T4:2.DN',)
|
||||
|
||||
c.write_tag('C5:0.CU', 1)
|
||||
c.write_tag('C5:0.CD', 0)
|
||||
c.write_tag('C5:0.DN', 1)
|
||||
c.write_tag('C5:0.OV', 0)
|
||||
c.write_tag('C5:0.UN', 1)
|
||||
c.write_tag('C5:0.UA', 0)
|
||||
print c.read_tag('C5:0.CU')
|
||||
print c.read_tag('C5:0.CD')
|
||||
print c.read_tag('C5:0.DN')
|
||||
print c.read_tag('C5:0.OV')
|
||||
print c.read_tag('C5:0.UN')
|
||||
print c.read_tag('C5:0.UA')
|
||||
|
||||
c.write_tag('B3:100', 1)
|
||||
print c.read_tag('B3:100')
|
||||
|
||||
c.write_tag('B3/3955', 1)
|
||||
print c.read_tag('B3/3955')
|
||||
|
||||
c.write_tag('N7:0/2', 1)
|
||||
print c.read_tag('N7:0/2')
|
||||
|
||||
print c.write_tag('O:0.0/4', 1)
|
||||
print c.read_tag('O:0.0/4')
|
||||
except Exception as e:
|
||||
err = c.get_status()
|
||||
#c.close()
|
||||
print err
|
||||
pass
|
||||
c.close()
|
||||
1
daq_sample/pycomm-master/pycomm/__init__.py
Executable file
@@ -0,0 +1 @@
|
||||
__author__ = 'agostino'
|
||||
2
daq_sample/pycomm-master/pycomm/ab_comm/__init__.py
Executable file
@@ -0,0 +1,2 @@
|
||||
__author__ = 'agostino'
|
||||
import logging
|
||||
873
daq_sample/pycomm-master/pycomm/ab_comm/clx.py
Executable file
@@ -0,0 +1,873 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# clx.py - Ethernet/IP Client for Rockwell PLCs
|
||||
#
|
||||
#
|
||||
# Copyright (c) 2014 Agostino Ruscito <ruscito@gmail.com>
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
# of this software and associated documentation files (the "Software"), to deal
|
||||
# in the Software without restriction, including without limitation the rights
|
||||
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
# copies of the Software, and to permit persons to whom the Software is
|
||||
# furnished to do so, subject to the following conditions:
|
||||
#
|
||||
# The above copyright notice and this permission notice shall be included in all
|
||||
# copies or substantial portions of the Software.
|
||||
#
|
||||
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
# SOFTWARE.
|
||||
#
|
||||
from pycomm.cip.cip_base import *
|
||||
import logging
|
||||
try: # Python 2.7+
|
||||
from logging import NullHandler
|
||||
except ImportError:
|
||||
class NullHandler(logging.Handler):
|
||||
def emit(self, record):
|
||||
pass
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
logger.addHandler(NullHandler())
|
||||
|
||||
|
||||
class Driver(Base):
|
||||
"""
|
||||
This Ethernet/IP client is based on Rockwell specification. Please refer to the link below for details.
|
||||
|
||||
http://literature.rockwellautomation.com/idc/groups/literature/documents/pm/1756-pm020_-en-p.pdf
|
||||
|
||||
The following services have been implemented:
|
||||
- Read Tag Service (0x4c)
|
||||
- Read Tag Fragment Service (0x52)
|
||||
- Write Tag Service (0x4d)
|
||||
- Write Tag Fragment Service (0x53)
|
||||
- Multiple Service Packet (0x0a)
|
||||
|
||||
The client has been successfully tested with the following PLCs:
|
||||
- CompactLogix 5330ERM
|
||||
- CompactLogix 5370
|
||||
- ControlLogix 5572 and 1756-EN2T Module
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
super(Driver, self).__init__()
|
||||
|
||||
self._buffer = {}
|
||||
self._get_template_in_progress = False
|
||||
self.__version__ = '0.2'
|
||||
|
||||
def get_last_tag_read(self):
|
||||
""" Return the last tag read by a multi request read
|
||||
|
||||
:return: A tuple (tag name, value, type)
|
||||
"""
|
||||
return self._last_tag_read
|
||||
|
||||
def get_last_tag_write(self):
|
||||
""" Return the last tag write by a multi request write
|
||||
|
||||
:return: A tuple (tag name, 'GOOD') if the write was successful otherwise (tag name, 'BAD')
|
||||
"""
|
||||
return self._last_tag_write
|
||||
|
||||
def _parse_instance_attribute_list(self, start_tag_ptr, status):
|
||||
""" extract the tags list from the message received
|
||||
|
||||
:param start_tag_ptr: The point in the message string where the tag list begin
|
||||
:param status: The status of the message receives
|
||||
"""
|
||||
tags_returned = self._reply[start_tag_ptr:]
|
||||
tags_returned_length = len(tags_returned)
|
||||
idx = 0
|
||||
instance = 0
|
||||
count = 0
|
||||
try:
|
||||
while idx < tags_returned_length:
|
||||
instance = unpack_dint(tags_returned[idx:idx+4])
|
||||
idx += 4
|
||||
tag_length = unpack_uint(tags_returned[idx:idx+2])
|
||||
idx += 2
|
||||
tag_name = tags_returned[idx:idx+tag_length]
|
||||
idx += tag_length
|
||||
symbol_type = unpack_uint(tags_returned[idx:idx+2])
|
||||
idx += 2
|
||||
count += 1
|
||||
self._tag_list.append({'instance_id': instance,
|
||||
'tag_name': tag_name,
|
||||
'symbol_type': symbol_type})
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
if status == SUCCESS:
|
||||
self._last_instance = -1
|
||||
elif status == 0x06:
|
||||
self._last_instance = instance + 1
|
||||
else:
|
||||
self._status = (1, 'unknown status during _parse_tag_list')
|
||||
self._last_instance = -1
|
||||
|
||||
def _parse_structure_makeup_attributes(self, start_tag_ptr, status):
|
||||
""" extract the tags list from the message received
|
||||
|
||||
:param start_tag_ptr: The point in the message string where the tag list begin
|
||||
:param status: The status of the message receives
|
||||
"""
|
||||
self._buffer = {}
|
||||
|
||||
if status != SUCCESS:
|
||||
self._buffer['Error'] = status
|
||||
return
|
||||
|
||||
attribute = self._reply[start_tag_ptr:]
|
||||
idx = 4
|
||||
try:
|
||||
if unpack_uint(attribute[idx:idx + 2]) == SUCCESS:
|
||||
idx += 2
|
||||
self._buffer['object_definition_size'] = unpack_dint(attribute[idx:idx + 4])
|
||||
else:
|
||||
self._buffer['Error'] = 'object_definition Error'
|
||||
return
|
||||
|
||||
idx += 6
|
||||
if unpack_uint(attribute[idx:idx + 2]) == SUCCESS:
|
||||
idx += 2
|
||||
self._buffer['structure_size'] = unpack_dint(attribute[idx:idx + 4])
|
||||
else:
|
||||
self._buffer['Error'] = 'structure Error'
|
||||
return
|
||||
|
||||
idx += 6
|
||||
if unpack_uint(attribute[idx:idx + 2]) == SUCCESS:
|
||||
idx += 2
|
||||
self._buffer['member_count'] = unpack_uint(attribute[idx:idx + 2])
|
||||
else:
|
||||
self._buffer['Error'] = 'member_count Error'
|
||||
return
|
||||
|
||||
idx += 4
|
||||
if unpack_uint(attribute[idx:idx + 2]) == SUCCESS:
|
||||
idx += 2
|
||||
self._buffer['structure_handle'] = unpack_uint(attribute[idx:idx + 2])
|
||||
else:
|
||||
self._buffer['Error'] = 'structure_handle Error'
|
||||
return
|
||||
|
||||
return self._buffer
|
||||
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _parse_template(self, start_tag_ptr, status):
|
||||
""" extract the tags list from the message received
|
||||
|
||||
:param start_tag_ptr: The point in the message string where the tag list begin
|
||||
:param status: The status of the message receives
|
||||
"""
|
||||
tags_returned = self._reply[start_tag_ptr:]
|
||||
bytes_received = len(tags_returned)
|
||||
|
||||
self._buffer += tags_returned
|
||||
|
||||
if status == SUCCESS:
|
||||
self._get_template_in_progress = False
|
||||
|
||||
elif status == 0x06:
|
||||
self._byte_offset += bytes_received
|
||||
else:
|
||||
self._status = (1, 'unknown status {0} during _parse_template'.format(status))
|
||||
logger.warning(self._status)
|
||||
self._last_instance = -1
|
||||
|
||||
def _parse_fragment(self, start_ptr, status):
|
||||
""" parse the fragment returned by a fragment service.
|
||||
|
||||
:param start_ptr: Where the fragment starts within the reply
|
||||
:param status: status field used to decide if keep parsing or stop
|
||||
"""
|
||||
try:
|
||||
data_type = unpack_uint(self._reply[start_ptr:start_ptr+2])
|
||||
fragment_returned = self._reply[start_ptr+2:]
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
fragment_returned_length = len(fragment_returned)
|
||||
idx = 0
|
||||
|
||||
while idx < fragment_returned_length:
|
||||
try:
|
||||
typ = I_DATA_TYPE[data_type]
|
||||
if self._output_raw:
|
||||
value = fragment_returned[idx:idx+DATA_FUNCTION_SIZE[typ]]
|
||||
else:
|
||||
value = UNPACK_DATA_FUNCTION[typ](fragment_returned[idx:idx+DATA_FUNCTION_SIZE[typ]])
|
||||
idx += DATA_FUNCTION_SIZE[typ]
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
if self._output_raw:
|
||||
self._tag_list += value
|
||||
else:
|
||||
self._tag_list.append((self._last_position, value))
|
||||
self._last_position += 1
|
||||
|
||||
if status == SUCCESS:
|
||||
self._byte_offset = -1
|
||||
elif status == 0x06:
|
||||
self._byte_offset += fragment_returned_length
|
||||
else:
|
||||
self._status = (2, 'unknown status during _parse_fragment')
|
||||
self._byte_offset = -1
|
||||
|
||||
def _parse_multiple_request_read(self, tags):
|
||||
""" parse the message received from a multi request read:
|
||||
|
||||
For each tag parsed, the information extracted includes the tag name, the value read and the data type.
|
||||
This information is appended to the tag list as a tuple
|
||||
|
||||
:return: the tag list
|
||||
"""
|
||||
offset = 50
|
||||
position = 50
|
||||
try:
|
||||
number_of_service_replies = unpack_uint(self._reply[offset:offset+2])
|
||||
tag_list = []
|
||||
for index in range(number_of_service_replies):
|
||||
position += 2
|
||||
start = offset + unpack_uint(self._reply[position:position+2])
|
||||
general_status = unpack_usint(self._reply[start+2:start+3])
|
||||
|
||||
if general_status == 0:
|
||||
data_type = unpack_uint(self._reply[start+4:start+6])
|
||||
value_begin = start + 6
|
||||
value_end = value_begin + DATA_FUNCTION_SIZE[I_DATA_TYPE[data_type]]
|
||||
value = self._reply[value_begin:value_end]
|
||||
self._last_tag_read = (tags[index], UNPACK_DATA_FUNCTION[I_DATA_TYPE[data_type]](value),
|
||||
I_DATA_TYPE[data_type])
|
||||
else:
|
||||
self._last_tag_read = (tags[index], None, None)
|
||||
|
||||
tag_list.append(self._last_tag_read)
|
||||
|
||||
return tag_list
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _parse_multiple_request_write(self, tags):
|
||||
""" parse the message received from a multi request writ:
|
||||
|
||||
For each tag parsed, the information extracted includes the tag name and the status of the writing.
|
||||
This information is appended to the tag list as a tuple
|
||||
|
||||
:return: the tag list
|
||||
"""
|
||||
offset = 50
|
||||
position = 50
|
||||
try:
|
||||
number_of_service_replies = unpack_uint(self._reply[offset:offset+2])
|
||||
tag_list = []
|
||||
for index in range(number_of_service_replies):
|
||||
position += 2
|
||||
start = offset + unpack_uint(self._reply[position:position+2])
|
||||
general_status = unpack_usint(self._reply[start+2:start+3])
|
||||
|
||||
if general_status == 0:
|
||||
self._last_tag_write = (tags[index] + ('GOOD',))
|
||||
else:
|
||||
self._last_tag_write = (tags[index] + ('BAD',))
|
||||
|
||||
tag_list.append(self._last_tag_write)
|
||||
return tag_list
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _check_reply(self):
|
||||
""" check the replayed message for error
|
||||
|
||||
"""
|
||||
self._more_packets_available = False
|
||||
try:
|
||||
if self._reply is None:
|
||||
self._status = (3, '%s without reply' % REPLAY_INFO[unpack_dint(self._message[:2])])
|
||||
return False
|
||||
# Get the type of command
|
||||
typ = unpack_uint(self._reply[:2])
|
||||
|
||||
# Encapsulation status check
|
||||
if unpack_dint(self._reply[8:12]) != SUCCESS:
|
||||
self._status = (3, "{0} reply status:{1}".format(REPLAY_INFO[typ],
|
||||
SERVICE_STATUS[unpack_dint(self._reply[8:12])]))
|
||||
return False
|
||||
|
||||
# Command Specific Status check
|
||||
if typ == unpack_uint(ENCAPSULATION_COMMAND["send_rr_data"]):
|
||||
status = unpack_usint(self._reply[42:43])
|
||||
if status != SUCCESS:
|
||||
self._status = (3, "send_rr_data reply:{0} - Extend status:{1}".format(
|
||||
SERVICE_STATUS[status], get_extended_status(self._reply, 42)))
|
||||
return False
|
||||
else:
|
||||
return True
|
||||
elif typ == unpack_uint(ENCAPSULATION_COMMAND["send_unit_data"]):
|
||||
status = unpack_usint(self._reply[48:49])
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Read Tag Fragmented"]:
|
||||
self._parse_fragment(50, status)
|
||||
return True
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Get Instance Attributes List"]:
|
||||
self._parse_instance_attribute_list(50, status)
|
||||
return True
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Get Attributes"]:
|
||||
self._parse_structure_makeup_attributes(50, status)
|
||||
return True
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Read Template"] and \
|
||||
self._get_template_in_progress:
|
||||
self._parse_template(50, status)
|
||||
return True
|
||||
if status == 0x06:
|
||||
self._status = (3, "Insufficient Packet Space")
|
||||
self._more_packets_available = True
|
||||
elif status != SUCCESS:
|
||||
self._status = (3, "send_unit_data reply:{0} - Extend status:{1}".format(
|
||||
SERVICE_STATUS[status], get_extended_status(self._reply, 48)))
|
||||
return False
|
||||
else:
|
||||
return True
|
||||
|
||||
return True
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def read_tag(self, tag):
|
||||
""" read tag from a connected plc
|
||||
|
||||
Possible combination can be passed to this method:
|
||||
- ('Counts') a single tag name
|
||||
- (['ControlWord']) a list with one tag or many
|
||||
- (['parts', 'ControlWord', 'Counts'])
|
||||
|
||||
At the moment there is not a strong validation for the argument passed. The user should verify
|
||||
the correctness of the format passed.
|
||||
|
||||
:return: None is returned in case of error otherwise the tag list is returned
|
||||
"""
|
||||
multi_requests = False
|
||||
if isinstance(tag, list):
|
||||
multi_requests = True
|
||||
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (6, "Target did not connected. read_tag will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. read_tag will not be executed.")
|
||||
|
||||
if multi_requests:
|
||||
rp_list = []
|
||||
for t in tag:
|
||||
rp = create_tag_rp(t, multi_requests=True)
|
||||
if rp is None:
|
||||
self._status = (6, "Cannot create tag {0} request packet. read_tag will not be executed.".format(tag))
|
||||
raise DataError("Cannot create tag {0} request packet. read_tag will not be executed.".format(tag))
|
||||
else:
|
||||
rp_list.append(chr(TAG_SERVICES_REQUEST['Read Tag']) + rp + pack_uint(1))
|
||||
message_request = build_multiple_service(rp_list, Base._get_sequence())
|
||||
|
||||
else:
|
||||
rp = create_tag_rp(tag)
|
||||
if rp is None:
|
||||
self._status = (6, "Cannot create tag {0} request packet. read_tag will not be executed.".format(tag))
|
||||
return None
|
||||
else:
|
||||
# Creating the Message Request Packet
|
||||
message_request = [
|
||||
pack_uint(Base._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST['Read Tag']), # the Request Service
|
||||
chr(len(rp) / 2), # the Request Path Size length in word
|
||||
rp, # the request path
|
||||
pack_uint(1)
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,
|
||||
)) is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
if multi_requests:
|
||||
return self._parse_multiple_request_read(tag)
|
||||
else:
|
||||
# Get the data type
|
||||
data_type = unpack_uint(self._reply[50:52])
|
||||
try:
|
||||
return UNPACK_DATA_FUNCTION[I_DATA_TYPE[data_type]](self._reply[52:]), I_DATA_TYPE[data_type]
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def read_array(self, tag, counts, raw=False):
|
||||
""" read array of atomic data type from a connected plc
|
||||
|
||||
At the moment there is not a strong validation for the argument passed. The user should verify
|
||||
the correctness of the format passed.
|
||||
|
||||
:param tag: the name of the tag to read
|
||||
:param counts: the number of element to read
|
||||
:param raw: the value should output as raw-value (hex)
|
||||
:return: None is returned in case of error otherwise the tag list is returned
|
||||
"""
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (7, "Target did not connected. read_tag will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. read_tag will not be executed.")
|
||||
|
||||
self._byte_offset = 0
|
||||
self._last_position = 0
|
||||
self._output_raw = raw
|
||||
|
||||
if self._output_raw:
|
||||
self._tag_list = ''
|
||||
else:
|
||||
self._tag_list = []
|
||||
while self._byte_offset != -1:
|
||||
rp = create_tag_rp(tag)
|
||||
if rp is None:
|
||||
self._status = (7, "Cannot create tag {0} request packet. read_tag will not be executed.".format(tag))
|
||||
return None
|
||||
else:
|
||||
# Creating the Message Request Packet
|
||||
message_request = [
|
||||
pack_uint(Base._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST["Read Tag Fragmented"]), # the Request Service
|
||||
chr(len(rp) / 2), # the Request Path Size length in word
|
||||
rp, # the request path
|
||||
pack_uint(counts),
|
||||
pack_dint(self._byte_offset)
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,
|
||||
)) is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
return self._tag_list
|
||||
|
||||
def write_tag(self, tag, value=None, typ=None):
|
||||
""" write tag/tags from a connected plc
|
||||
|
||||
Possible combination can be passed to this method:
|
||||
- ('tag name', Value, data type) as single parameters or inside a tuple
|
||||
- ([('tag name', Value, data type), ('tag name2', Value, data type)]) as array of tuples
|
||||
|
||||
At the moment there is not a strong validation for the argument passed. The user should verify
|
||||
the correctness of the format passed.
|
||||
|
||||
The type accepted are:
|
||||
- BOOL
|
||||
- SINT
|
||||
- INT'
|
||||
- DINT
|
||||
- REAL
|
||||
- LINT
|
||||
- BYTE
|
||||
- WORD
|
||||
- DWORD
|
||||
- LWORD
|
||||
|
||||
:param tag: tag name, or an array of tuple containing (tag name, value, data type)
|
||||
:param value: the value to write or none if tag is an array of tuple or a tuple
|
||||
:param typ: the type of the tag to write or none if tag is an array of tuple or a tuple
|
||||
:return: None is returned in case of error otherwise the tag list is returned
|
||||
"""
|
||||
multi_requests = False
|
||||
if isinstance(tag, list):
|
||||
multi_requests = True
|
||||
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (8, "Target did not connected. write_tag will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. write_tag will not be executed.")
|
||||
|
||||
if multi_requests:
|
||||
rp_list = []
|
||||
tag_to_remove = []
|
||||
idx = 0
|
||||
for name, value, typ in tag:
|
||||
# Create the request path to wrap the tag name
|
||||
rp = create_tag_rp(name, multi_requests=True)
|
||||
if rp is None:
|
||||
self._status = (8, "Cannot create tag{0} req. packet. write_tag will not be executed".format(tag))
|
||||
return None
|
||||
else:
|
||||
try: # Trying to add the rp to the request path list
|
||||
val = PACK_DATA_FUNCTION[typ](value)
|
||||
rp_list.append(
|
||||
chr(TAG_SERVICES_REQUEST['Write Tag'])
|
||||
+ rp
|
||||
+ pack_uint(S_DATA_TYPE[typ])
|
||||
+ pack_uint(1)
|
||||
+ val
|
||||
)
|
||||
idx += 1
|
||||
except (LookupError, struct.error) as e:
|
||||
self._status = (8, "Tag:{0} type:{1} removed from write list. Error:{2}.".format(name, typ, e))
|
||||
|
||||
# The tag in idx position need to be removed from the rp list because has some kind of error
|
||||
tag_to_remove.append(idx)
|
||||
|
||||
# Remove the tags that have not been inserted in the request path list
|
||||
for position in tag_to_remove:
|
||||
del tag[position]
|
||||
# Create the message request
|
||||
message_request = build_multiple_service(rp_list, Base._get_sequence())
|
||||
|
||||
else:
|
||||
if isinstance(tag, tuple):
|
||||
name, value, typ = tag
|
||||
else:
|
||||
name = tag
|
||||
|
||||
rp = create_tag_rp(name)
|
||||
if rp is None:
|
||||
self._status = (8, "Cannot create tag {0} request packet. write_tag will not be executed.".format(tag))
|
||||
logger.warning(self._status)
|
||||
return None
|
||||
else:
|
||||
# Creating the Message Request Packet
|
||||
message_request = [
|
||||
pack_uint(Base._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST["Write Tag"]), # the Request Service
|
||||
chr(len(rp) / 2), # the Request Path Size length in word
|
||||
rp, # the request path
|
||||
pack_uint(S_DATA_TYPE[typ]), # data type
|
||||
pack_uint(1), # Add the number of tag to write
|
||||
PACK_DATA_FUNCTION[typ](value)
|
||||
]
|
||||
|
||||
ret_val = self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,
|
||||
)
|
||||
)
|
||||
|
||||
if multi_requests:
|
||||
return self._parse_multiple_request_write(tag)
|
||||
else:
|
||||
if ret_val is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
return ret_val
|
||||
|
||||
def write_array(self, tag, data_type, values, raw=False):
|
||||
""" write array of atomic data type from a connected plc
|
||||
|
||||
At the moment there is not a strong validation for the argument passed. The user should verify
|
||||
the correctness of the format passed.
|
||||
|
||||
:param tag: the name of the tag to read
|
||||
:param data_type: the type of tag to write
|
||||
:param values: the array of values to write, if raw: the frame with bytes
|
||||
:param raw: indicates that the values are given as raw values (hex)
|
||||
"""
|
||||
if not isinstance(values, list):
|
||||
self._status = (9, "A list of tags must be passed to write_array.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("A list of tags must be passed to write_array.")
|
||||
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (9, "Target did not connected. write_array will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. write_array will not be executed.")
|
||||
|
||||
array_of_values = ""
|
||||
byte_size = 0
|
||||
byte_offset = 0
|
||||
|
||||
for i, value in enumerate(values):
|
||||
if raw:
|
||||
array_of_values += value
|
||||
else:
|
||||
array_of_values += PACK_DATA_FUNCTION[data_type](value)
|
||||
byte_size += DATA_FUNCTION_SIZE[data_type]
|
||||
|
||||
if byte_size >= 450 or i == len(values)-1:
|
||||
# create the message and send the fragment
|
||||
rp = create_tag_rp(tag)
|
||||
if rp is None:
|
||||
self._status = (9, "Cannot create tag {0} request packet. \
|
||||
write_array will not be executed.".format(tag))
|
||||
return None
|
||||
else:
|
||||
# Creating the Message Request Packet
|
||||
message_request = [
|
||||
pack_uint(Base._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST["Write Tag Fragmented"]), # the Request Service
|
||||
chr(len(rp) / 2), # the Request Path Size length in word
|
||||
rp, # the request path
|
||||
pack_uint(S_DATA_TYPE[data_type]), # Data type to write
|
||||
pack_uint(len(values)), # Number of elements to write
|
||||
pack_dint(byte_offset),
|
||||
array_of_values # Fragment of elements to write
|
||||
]
|
||||
byte_offset += byte_size
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,
|
||||
)) is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
array_of_values = ""
|
||||
byte_size = 0
|
||||
|
||||
def _get_instance_attribute_list_service(self):
|
||||
""" Step 1: Finding user-created controller scope tags in a Logix5000 controller
|
||||
|
||||
This service returns instance IDs for each created instance of the symbol class, along with a list
|
||||
of the attribute data associated with the requested attribute
|
||||
"""
|
||||
try:
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (10, "Target did not connected. get_tag_list will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. get_tag_list will not be executed.")
|
||||
|
||||
self._last_instance = 0
|
||||
|
||||
self._get_template_in_progress = True
|
||||
while self._last_instance != -1:
|
||||
|
||||
# Creating the Message Request Packet
|
||||
|
||||
message_request = [
|
||||
pack_uint(Base._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST['Get Instance Attributes List']), # STEP 1
|
||||
# the Request Path Size length in word
|
||||
chr(3),
|
||||
# Request Path ( 20 6B 25 00 Instance )
|
||||
CLASS_ID["8-bit"], # Class id = 20 from spec 0x20
|
||||
CLASS_CODE["Symbol Object"], # Logical segment: Symbolic Object 0x6B
|
||||
INSTANCE_ID["16-bit"], # Instance Segment: 16 Bit instance 0x25
|
||||
'\x00',
|
||||
pack_uint(self._last_instance), # The instance
|
||||
# Request Data
|
||||
pack_uint(2), # Number of attributes to retrieve
|
||||
pack_uint(1), # Attribute 1: Symbol name
|
||||
pack_uint(2) # Attribute 2: Symbol type
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,
|
||||
)) is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
self._get_template_in_progress = False
|
||||
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _get_structure_makeup(self, instance_id):
|
||||
"""
|
||||
get the structure makeup for a specific structure
|
||||
"""
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (10, "Target did not connected. get_tag_list will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. get_tag_list will not be executed.")
|
||||
|
||||
message_request = [
|
||||
pack_uint(self._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST['Get Attributes']),
|
||||
chr(3), # Request Path ( 20 6B 25 00 Instance )
|
||||
CLASS_ID["8-bit"], # Class id = 20 from spec 0x20
|
||||
CLASS_CODE["Template Object"], # Logical segment: Template Object 0x6C
|
||||
INSTANCE_ID["16-bit"], # Instance Segment: 16 Bit instance 0x25
|
||||
'\x00',
|
||||
pack_uint(instance_id),
|
||||
pack_uint(4), # Number of attributes
|
||||
pack_uint(4), # Template Object Definition Size UDINT
|
||||
pack_uint(5), # Template Structure Size UDINT
|
||||
pack_uint(2), # Template Member Count UINT
|
||||
pack_uint(1) # Structure Handle We can use this to read and write UINT
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(DATA_ITEM['Connected'],
|
||||
''.join(message_request), ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,)) is None:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
return self._buffer
|
||||
|
||||
def _read_template(self, instance_id, object_definition_size):
|
||||
""" get a list of the tags in the plc
|
||||
|
||||
"""
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (10, "Target did not connected. get_tag_list will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. get_tag_list will not be executed.")
|
||||
|
||||
self._byte_offset = 0
|
||||
self._buffer = ""
|
||||
self._get_template_in_progress = True
|
||||
|
||||
try:
|
||||
while self._get_template_in_progress:
|
||||
|
||||
# Creating the Message Request Packet
|
||||
|
||||
message_request = [
|
||||
pack_uint(self._get_sequence()),
|
||||
chr(TAG_SERVICES_REQUEST['Read Template']),
|
||||
chr(3), # Request Path ( 20 6B 25 00 Instance )
|
||||
CLASS_ID["8-bit"], # Class id = 20 from spec 0x20
|
||||
CLASS_CODE["Template Object"], # Logical segment: Template Object 0x6C
|
||||
INSTANCE_ID["16-bit"], # Instance Segment: 16 Bit instance 0x25
|
||||
'\x00',
|
||||
pack_uint(instance_id),
|
||||
pack_dint(self._byte_offset), # Offset
|
||||
pack_uint(((object_definition_size * 4)-23) - self._byte_offset)
|
||||
]
|
||||
|
||||
if not self.send_unit_data(
|
||||
build_common_packet_format(DATA_ITEM['Connected'], ''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'], addr_data=self._target_cid,)):
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
|
||||
self._get_template_in_progress = False
|
||||
return self._buffer
|
||||
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _isolating_user_tag(self):
|
||||
try:
|
||||
lst = self._tag_list
|
||||
self._tag_list = []
|
||||
for tag in lst:
|
||||
if tag['tag_name'].find(':') != -1 or tag['tag_name'].find('__') != -1:
|
||||
continue
|
||||
if tag['symbol_type'] & 0b0001000000000000:
|
||||
continue
|
||||
dimension = (tag['symbol_type'] & 0b0110000000000000) >> 13
|
||||
|
||||
if tag['symbol_type'] & 0b1000000000000000 :
|
||||
template_instance_id = tag['symbol_type'] & 0b0000111111111111
|
||||
tag_type = 'struct'
|
||||
data_type = 'user-created'
|
||||
self._tag_list.append({'instance_id': tag['instance_id'],
|
||||
'template_instance_id': template_instance_id,
|
||||
'tag_name': tag['tag_name'],
|
||||
'dim': dimension,
|
||||
'tag_type': tag_type,
|
||||
'data_type': data_type,
|
||||
'template': {},
|
||||
'udt': {}})
|
||||
else:
|
||||
tag_type = 'atomic'
|
||||
datatype = tag['symbol_type'] & 0b0000000011111111
|
||||
data_type = I_DATA_TYPE[datatype]
|
||||
if datatype == 0xc1:
|
||||
bit_position = (tag['symbol_type'] & 0b0000011100000000) >> 8
|
||||
self._tag_list.append({'instance_id': tag['instance_id'],
|
||||
'tag_name': tag['tag_name'],
|
||||
'dim': dimension,
|
||||
'tag_type': tag_type,
|
||||
'data_type': data_type,
|
||||
'bit_position' : bit_position})
|
||||
else:
|
||||
self._tag_list.append({'instance_id': tag['instance_id'],
|
||||
'tag_name': tag['tag_name'],
|
||||
'dim': dimension,
|
||||
'tag_type': tag_type,
|
||||
'data_type': data_type})
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def _parse_udt_raw(self, tag):
|
||||
try:
|
||||
buff = self._read_template(tag['template_instance_id'], tag['template']['object_definition_size'])
|
||||
member_count = tag['template']['member_count']
|
||||
names = buff.split('\00')
|
||||
lst = []
|
||||
|
||||
tag['udt']['name'] = 'Not an user defined structure'
|
||||
for name in names:
|
||||
if len(name) > 1:
|
||||
|
||||
if name.find(';') != -1:
|
||||
tag['udt']['name'] = name[:name.find(';')]
|
||||
elif name.find('ZZZZZZZZZZ') != -1:
|
||||
continue
|
||||
elif name.isalpha():
|
||||
lst.append(name)
|
||||
else:
|
||||
continue
|
||||
tag['udt']['internal_tags'] = lst
|
||||
|
||||
type_list = []
|
||||
|
||||
for i in xrange(member_count):
|
||||
# skip member 1
|
||||
|
||||
if i != 0:
|
||||
array_size = unpack_uint(buff[:2])
|
||||
try:
|
||||
data_type = I_DATA_TYPE[unpack_uint(buff[2:4])]
|
||||
except Exception:
|
||||
data_type = "None"
|
||||
|
||||
offset = unpack_dint(buff[4:8])
|
||||
type_list.append((array_size, data_type, offset))
|
||||
|
||||
buff = buff[8:]
|
||||
|
||||
tag['udt']['data_type'] = type_list
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
|
||||
def get_tag_list(self):
|
||||
self._tag_list = []
|
||||
# Step 1
|
||||
self._get_instance_attribute_list_service()
|
||||
|
||||
# Step 2
|
||||
self._isolating_user_tag()
|
||||
|
||||
# Step 3
|
||||
for tag in self._tag_list:
|
||||
if tag['tag_type'] == 'struct':
|
||||
tag['template'] = self._get_structure_makeup(tag['template_instance_id'])
|
||||
|
||||
for idx, tag in enumerate(self._tag_list):
|
||||
# print (tag)
|
||||
if tag['tag_type'] == 'struct':
|
||||
self._parse_udt_raw(tag)
|
||||
|
||||
# Step 4
|
||||
|
||||
return self._tag_list
|
||||
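Beyond the read/write snippets in the README, the get_tag_list() method implemented above can be used to browse a controller's user tags; a sketch (the IP address is the same placeholder the README uses):

    from pycomm.ab_comm.clx import Driver as ClxDriver

    c = ClxDriver()
    if c.open('172.16.2.161'):
        for entry in c.get_tag_list():
            # dicts built by _isolating_user_tag(): 'tag_name', 'tag_type',
            # 'dim', 'data_type', plus template/udt details for structs
            print(entry)
        c.close()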
574
daq_sample/pycomm-master/pycomm/ab_comm/slc.py
Executable file
@@ -0,0 +1,574 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# clx.py - Ethernet/IP Client for Rockwell PLCs
|
||||
#
|
||||
#
|
||||
# Copyright (c) 2014 Agostino Ruscito <ruscito@gmail.com>
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
# of this software and associated documentation files (the "Software"), to deal
|
||||
# in the Software without restriction, including without limitation the rights
|
||||
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
# copies of the Software, and to permit persons to whom the Software is
|
||||
# furnished to do so, subject to the following conditions:
|
||||
#
|
||||
# The above copyright notice and this permission notice shall be included in all
|
||||
# copies or substantial portions of the Software.
|
||||
#
|
||||
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
# SOFTWARE.
|
||||
#
|
||||
from pycomm.cip.cip_base import *
|
||||
import re
|
||||
import math
|
||||
#import binascii
|
||||
|
||||
import logging
|
||||
try: # Python 2.7+
|
||||
from logging import NullHandler
|
||||
except ImportError:
|
||||
class NullHandler(logging.Handler):
|
||||
def emit(self, record):
|
||||
pass
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
logger.addHandler(NullHandler())
|
||||
|
||||
|
||||
def parse_tag(tag):
|
||||
t = re.search(r"(?P<file_type>[CT])(?P<file_number>\d{1,3})"
|
||||
r"(:)(?P<element_number>\d{1,3})"
|
||||
r"(.)(?P<sub_element>ACC|PRE|EN|DN|TT|CU|CD|DN|OV|UN|UA)", tag, flags=re.IGNORECASE)
|
||||
if t:
|
||||
if (1 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 255):
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': t.group('element_number'),
|
||||
'sub_element': PCCC_CT[t.group('sub_element').upper()],
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 3}
|
||||
|
||||
t = re.search(r"(?P<file_type>[LFBN])(?P<file_number>\d{1,3})"
|
||||
r"(:)(?P<element_number>\d{1,3})"
|
||||
r"(/(?P<sub_element>\d{1,2}))?",
|
||||
tag, flags=re.IGNORECASE)
|
||||
if t:
|
||||
if t.group('sub_element') is not None:
|
||||
if (1 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 255) \
|
||||
and (0 <= int(t.group('sub_element')) <= 15):
|
||||
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': t.group('element_number'),
|
||||
'sub_element': t.group('sub_element'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 3}
|
||||
else:
|
||||
if (1 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 255):
|
||||
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': t.group('element_number'),
|
||||
'sub_element': t.group('sub_element'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 2}
|
||||
|
||||
t = re.search(r"(?P<file_type>[IO])(:)(?P<file_number>\d{1,3})"
|
||||
r"(.)(?P<element_number>\d{1,3})"
|
||||
r"(/(?P<sub_element>\d{1,2}))?", tag, flags=re.IGNORECASE)
|
||||
if t:
|
||||
if t.group('sub_element') is not None:
|
||||
if (0 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 255) \
|
||||
and (0 <= int(t.group('sub_element')) <= 15):
|
||||
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': t.group('element_number'),
|
||||
'sub_element': t.group('sub_element'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 3}
|
||||
else:
|
||||
if (0 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 255):
|
||||
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': t.group('element_number'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 2}
|
||||
|
||||
t = re.search(r"(?P<file_type>S)"
|
||||
r"(:)(?P<element_number>\d{1,3})"
|
||||
r"(/(?P<sub_element>\d{1,2}))?", tag, flags=re.IGNORECASE)
|
||||
if t:
|
||||
if t.group('sub_element') is not None:
|
||||
if (0 <= int(t.group('element_number')) <= 255) \
|
||||
and (0 <= int(t.group('sub_element')) <= 15):
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': '2',
|
||||
'element_number': t.group('element_number'),
|
||||
'sub_element': t.group('sub_element'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 3}
|
||||
else:
|
||||
if 0 <= int(t.group('element_number')) <= 255:
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': '2',
|
||||
'element_number': t.group('element_number'),
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 2}
|
||||
|
||||
t = re.search(r"(?P<file_type>B)(?P<file_number>\d{1,3})"
|
||||
r"(/)(?P<element_number>\d{1,4})",
|
||||
tag, flags=re.IGNORECASE)
|
||||
if t:
|
||||
if (1 <= int(t.group('file_number')) <= 255) \
|
||||
and (0 <= int(t.group('element_number')) <= 4095):
|
||||
bit_position = int(t.group('element_number'))
|
||||
element_number = bit_position / 16
|
||||
sub_element = bit_position - (element_number * 16)
|
||||
return True, t.group(0), {'file_type': t.group('file_type').upper(),
|
||||
'file_number': t.group('file_number'),
|
||||
'element_number': element_number,
|
||||
'sub_element': sub_element,
|
||||
'read_func': '\xa2',
|
||||
'write_func': '\xab',
|
||||
'address_field': 3}
|
||||
|
||||
return False, tag
|
||||
|
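# Illustrative note (not part of slc.py): parse_tag() above validates an
# SLC/PLC-5 style address and returns (True, matched_text, fields) when one
# of the patterns matches, or (False, tag) otherwise. For example:
#
#   ok, matched, fields = parse_tag('N7:0')
#   # ok is True, fields['file_type'] == 'N', fields['address_field'] == 2
#
#   result = parse_tag('X9:9')
#   # result == (False, 'X9:9') for an unsupported file type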
||||
|
||||
class Driver(Base):
|
||||
"""
|
||||
SLC/PLC_5 Implementation
|
||||
"""
|
||||
def __init__(self):
|
||||
super(Driver, self).__init__()
|
||||
|
||||
self.__version__ = '0.1'
|
||||
self._last_sequence = 0
|
||||
|
||||
def _check_reply(self):
|
||||
"""
|
||||
check the replied message for errors
|
||||
"""
|
||||
self._more_packets_available = False
|
||||
try:
|
||||
if self._reply is None:
|
||||
self._status = (3, '%s without reply' % REPLAY_INFO[unpack_dint(self._message[:2])])
|
||||
return False
|
||||
# Get the type of command
|
||||
typ = unpack_uint(self._reply[:2])
|
||||
|
||||
# Encapsulation status check
|
||||
if unpack_dint(self._reply[8:12]) != SUCCESS:
|
||||
self._status = (3, "{0} reply status:{1}".format(REPLAY_INFO[typ],
|
||||
SERVICE_STATUS[unpack_dint(self._reply[8:12])]))
|
||||
return False
|
||||
|
||||
# Command Specific Status check
|
||||
if typ == unpack_uint(ENCAPSULATION_COMMAND["send_rr_data"]):
|
||||
status = unpack_usint(self._reply[42:43])
|
||||
if status != SUCCESS:
|
||||
self._status = (3, "send_rr_data reply:{0} - Extend status:{1}".format(
|
||||
SERVICE_STATUS[status], get_extended_status(self._reply, 42)))
|
||||
return False
|
||||
else:
|
||||
return True
|
||||
|
||||
elif typ == unpack_uint(ENCAPSULATION_COMMAND["send_unit_data"]):
|
||||
status = unpack_usint(self._reply[48:49])
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Read Tag Fragmented"]:
|
||||
self._parse_fragment(50, status)
|
||||
return True
|
||||
if unpack_usint(self._reply[46:47]) == I_TAG_SERVICES_REPLY["Get Instance Attributes List"]:
|
||||
self._parse_tag_list(50, status)
|
||||
return True
|
||||
if status == 0x06:
|
||||
self._status = (3, "Insufficient Packet Space")
|
||||
self._more_packets_available = True
|
||||
elif status != SUCCESS:
|
||||
self._status = (3, "send_unit_data reply:{0} - Extend status:{1}".format(
|
||||
SERVICE_STATUS[status], get_extended_status(self._reply, 48)))
|
||||
return False
|
||||
else:
|
||||
return True
|
||||
|
||||
return True
|
||||
except Exception as e:
|
||||
raise DataError(e)
|
||||
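The byte offsets checked above follow the encapsulation layout used throughout cip_base.py further down in this diff. A hedged sketch of pulling the same fields out of a raw send_unit_data reply (here reply is assumed to already hold the received bytes):

command = unpack_uint(reply[:2])             # encapsulation command code
encap_status = unpack_dint(reply[8:12])      # 0 == SUCCESS
reply_service = unpack_usint(reply[46:47])   # e.g. 0xd2 == Read Tag Fragmented reply
general_status = unpack_usint(reply[48:49])  # 0x00 ok, 0x06 more packets available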
|
||||
def __queue_data_available(self, queue_number):
|
||||
""" read the queue
|
||||
|
||||
Possible combination can be passed to this method:
|
||||
print c.read_tag('F8:0', 3) return a list of 3 registers starting from F8:0
|
||||
print c.read_tag('F8:0') return one value
|
||||
|
||||
It is possible to read status bit
|
||||
|
||||
:return: None is returned in case of error
|
||||
"""
|
||||
|
||||
# Creating the Message Request Packet
|
||||
self._last_sequence = pack_uint(Base._get_sequence())
|
||||
|
||||
# PCCC_Cmd_Rd_w3_Q2 = [0x0f, 0x00, 0x30, 0x00, 0xa2, 0x6d, 0x00, 0xa5, 0x02, 0x00]
|
||||
message_request = [
|
||||
self._last_sequence,
|
||||
'\x4b',
|
||||
'\x02',
|
||||
CLASS_ID["8-bit"],
|
||||
PATH["PCCC"],
|
||||
'\x07',
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
'\x0f',
|
||||
'\x00',
|
||||
self._last_sequence[1],
|
||||
self._last_sequence[0],
|
||||
'\xa2', # protected typed logical read with three address fields FNC
|
||||
'\x6d', # Byte size to read = 109
|
||||
'\x00', # File Number
|
||||
'\xa5', # File Type
|
||||
pack_uint(queue_number)
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,)):
|
||||
|
||||
sts = int(unpack_uint(self._reply[2:4]))
|
||||
if sts == 146:
|
||||
return True
|
||||
else:
|
||||
return False
|
||||
else:
|
||||
raise DataError("read_queue [send_unit_data] returned not valid data")
|
||||
|
||||
def __save_record(self, filename):
|
||||
with open(filename, "a") as csv_file:
|
||||
logger.debug("SLC __save_record read:{0}".format(self._reply[61:]))
|
||||
csv_file.write(self._reply[61:]+'\n')
|
||||
|
||||
|
||||
def __get_queue_size(self, queue_number):
|
||||
""" get queue size
|
||||
"""
|
||||
# Creating the Message Request Packet
|
||||
self._last_sequence = pack_uint(Base._get_sequence())
|
||||
|
||||
message_request = [
|
||||
self._last_sequence,
|
||||
'\x4b',
|
||||
'\x02',
|
||||
CLASS_ID["8-bit"],
|
||||
PATH["PCCC"],
|
||||
'\x07',
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
'\x0f',
|
||||
'\x00',
|
||||
self._last_sequence[1],
|
||||
self._last_sequence[0],
|
||||
# '\x30',
|
||||
# '\x00',
|
||||
'\xa1', # FNC to get the queue size
|
||||
'\x06', # Byte size to read = 06
|
||||
'\x00', # File Number
|
||||
'\xea', # File Type ????
|
||||
'\xff', # File Type ????
|
||||
pack_uint(queue_number)
|
||||
]
|
||||
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,)):
|
||||
sts = int(unpack_uint(self._reply[65:67]))
|
||||
logger.debug("SLC __get_queue_size({0}) returned {1}".format(queue_number, sts))
|
||||
return sts
|
||||
else:
|
||||
raise DataError("read_queue [send_unit_data] returned not valid data")
|
||||
|
||||
def read_queue(self, queue_number, file_name):
|
||||
""" read the queue
|
||||
|
||||
"""
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (5, "Target did not connected. is_queue_available will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. is_queue_available will not be executed.")
|
||||
|
||||
if self.__queue_data_available(queue_number):
|
||||
logger.debug("SLC read_queue: Queue {0} has data".format(queue_number))
|
||||
self.__save_record(file_name + str(queue_number) + ".csv")
|
||||
size = self.__get_queue_size(queue_number)
|
||||
if size > 0:
|
||||
for i in range(0, size):
|
||||
if self.__queue_data_available(queue_number):
|
||||
self.__save_record(file_name + str(queue_number) + ".csv")
|
||||
|
||||
logger.debug("SLC read_queue: {0} record extract from queue {1}".format(size, queue_number))
|
||||
else:
|
||||
logger.debug("SLC read_queue: Queue {0} has no data".format(queue_number))
|
||||
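A minimal usage sketch for the queue reader above (the IP address, queue number and output path are hypothetical, not taken from the pull request):

c = Driver()
if c.open('192.168.1.10'):
    # appends any waiting records to /tmp/queue_1.csv
    c.read_queue(1, '/tmp/queue_')
    c.close()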
|
||||
def read_tag(self, tag, n=1):
|
||||
""" read tag from a connected plc
|
||||
|
||||
Possible combination can be passed to this method:
|
||||
print c.read_tag('F8:0', 3) return a list of 3 registers starting from F8:0
|
||||
print c.read_tag('F8:0') return one value
|
||||
|
||||
It is possible to read status bit
|
||||
|
||||
:return: None is returned in case of error
|
||||
"""
|
||||
res = parse_tag(tag)
|
||||
if not res[0]:
|
||||
self._status = (1000, "Error parsing the tag passed to read_tag({0},{1})".format(tag, n))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error parsing the tag passed to read_tag({0},{1})".format(tag, n))
|
||||
|
||||
bit_read = False
|
||||
bit_position = 0
|
||||
sub_element = 0
|
||||
if int(res[2]['address_field']) == 3:
|
||||
bit_read = True
|
||||
bit_position = int(res[2]['sub_element'])
|
||||
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (5, "Target did not connected. read_tag will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. read_tag will not be executed.")
|
||||
|
||||
data_size = PCCC_DATA_SIZE[res[2]['file_type']]
|
||||
|
||||
# Creating the Message Request Packet
|
||||
self._last_sequence = pack_uint(Base._get_sequence())
|
||||
|
||||
message_request = [
|
||||
self._last_sequence,
|
||||
'\x4b',
|
||||
'\x02',
|
||||
CLASS_ID["8-bit"],
|
||||
PATH["PCCC"],
|
||||
'\x07',
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
'\x0f',
|
||||
'\x00',
|
||||
self._last_sequence[1],
|
||||
self._last_sequence[0],
|
||||
res[2]['read_func'],
|
||||
pack_usint(data_size * n),
|
||||
pack_usint(int(res[2]['file_number'])),
|
||||
PCCC_DATA_TYPE[res[2]['file_type']],
|
||||
pack_usint(int(res[2]['element_number'])),
|
||||
pack_usint(sub_element)
|
||||
]
|
||||
|
||||
logger.debug("SLC read_tag({0},{1})".format(tag, n))
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request),
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,)):
|
||||
sts = int(unpack_usint(self._reply[58]))
|
||||
try:
|
||||
if sts != 0:
|
||||
sts_txt = PCCC_ERROR_CODE[sts]
|
||||
self._status = (1000, "Error({0}) returned from read_tag({1},{2})".format(sts_txt, tag, n))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error({0}) returned from read_tag({1},{2})".format(sts_txt, tag, n))
|
||||
|
||||
new_value = 61
|
||||
if bit_read:
|
||||
if res[2]['file_type'] == 'T' or res[2]['file_type'] == 'C':
|
||||
if bit_position == PCCC_CT['PRE']:
|
||||
return UNPACK_PCCC_DATA_FUNCTION[res[2]['file_type']](
|
||||
self._reply[new_value+2:new_value+2+data_size])
|
||||
elif bit_position == PCCC_CT['ACC']:
|
||||
return UNPACK_PCCC_DATA_FUNCTION[res[2]['file_type']](
|
||||
self._reply[new_value+4:new_value+4+data_size])
|
||||
|
||||
tag_value = UNPACK_PCCC_DATA_FUNCTION[res[2]['file_type']](
|
||||
self._reply[new_value:new_value+data_size])
|
||||
return get_bit(tag_value, bit_position)
|
||||
|
||||
else:
|
||||
values_list = []
|
||||
while len(self._reply[new_value:]) >= data_size:
|
||||
values_list.append(
|
||||
UNPACK_PCCC_DATA_FUNCTION[res[2]['file_type']](self._reply[new_value:new_value+data_size])
|
||||
)
|
||||
new_value = new_value+data_size
|
||||
|
||||
if len(values_list) > 1:
|
||||
return values_list
|
||||
else:
|
||||
return values_list[0]
|
||||
|
||||
except Exception as e:
|
||||
self._status = (1000, "Error({0}) parsing the data returned from read_tag({1},{2})".format(e, tag, n))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error({0}) parsing the data returned from read_tag({1},{2})".format(e, tag, n))
|
||||
else:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
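A minimal usage sketch matching the docstring above (hypothetical target address):

c = Driver()
if c.open('192.168.1.10'):
    print c.read_tag('F8:0')        # single REAL value
    print c.read_tag('N7:0', 3)     # list of 3 INT registers starting at N7:0
    c.close()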
|
||||
def write_tag(self, tag, value):
|
||||
""" write tag from a connected plc
|
||||
|
||||
Possible combination can be passed to this method:
|
||||
c.write_tag('N7:0', [-30, 32767, -32767])
|
||||
c.write_tag('N7:0', 21)
|
||||
c.read_tag('N7:0', 10)
|
||||
|
||||
It is not possible to write status bit
|
||||
|
||||
:return: None is returned in case of error
|
||||
"""
|
||||
res = parse_tag(tag)
|
||||
if not res[0]:
|
||||
self._status = (1000, "Error parsing the tag passed to read_tag({0},{1})".format(tag, value))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error parsing the tag passed to read_tag({0},{1})".format(tag, value))
|
||||
|
||||
if isinstance(value, list) and int(res[2]['address_field']) == 3:
|
||||
self._status = (1000, "Function's parameters error. read_tag({0},{1})".format(tag, value))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Function's parameters error. read_tag({0},{1})".format(tag, value))
|
||||
|
||||
|
||||
bit_field = False
|
||||
bit_position = 0
|
||||
sub_element = 0
|
||||
if int(res[2]['address_field']) == 3:
|
||||
bit_field = True
|
||||
bit_position = int(res[2]['sub_element'])
|
||||
values_list = ''
|
||||
else:
|
||||
values_list = '\xff\xff'
|
||||
|
||||
multi_requests = False
|
||||
if isinstance(value, list):
|
||||
multi_requests = True
|
||||
|
||||
if not self._target_is_connected:
|
||||
if not self.forward_open():
|
||||
self._status = (1000, "Target did not connected. write_tag will not be executed.")
|
||||
logger.warning(self._status)
|
||||
raise DataError("Target did not connected. write_tag will not be executed.")
|
||||
|
||||
try:
|
||||
n = 0
|
||||
if multi_requests:
|
||||
data_size = PCCC_DATA_SIZE[res[2]['file_type']]
|
||||
for v in value:
|
||||
values_list += PACK_PCCC_DATA_FUNCTION[res[2]['file_type']](v)
|
||||
n += 1
|
||||
else:
|
||||
n = 1
|
||||
if bit_field:
|
||||
data_size = 2
|
||||
|
||||
if (res[2]['file_type'] == 'T' or res[2]['file_type'] == 'C') \
|
||||
and (bit_position == PCCC_CT['PRE'] or bit_position == PCCC_CT['ACC']):
|
||||
sub_element = bit_position
|
||||
values_list = '\xff\xff' + PACK_PCCC_DATA_FUNCTION[res[2]['file_type']](value)
|
||||
else:
|
||||
sub_element = 0
|
||||
if value > 0:
|
||||
values_list = pack_uint(math.pow(2, bit_position)) + pack_uint(math.pow(2, bit_position))
|
||||
else:
|
||||
values_list = pack_uint(math.pow(2, bit_position)) + pack_uint(0)
|
||||
|
||||
else:
|
||||
values_list += PACK_PCCC_DATA_FUNCTION[res[2]['file_type']](value)
|
||||
data_size = PCCC_DATA_SIZE[res[2]['file_type']]
|
||||
|
||||
except Exception as e:
|
||||
self._status = (1000, "Error({0}) packing the values to write to the"
|
||||
"SLC write_tag({1},{2})".format(e, tag, value))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error({0}) packing the values to write to the "
|
||||
"SLC write_tag({1},{2})".format(e, tag, value))
|
||||
|
||||
data_to_write = values_list
|
||||
|
||||
# Creating the Message Request Packet
|
||||
self._last_sequence = pack_uint(Base._get_sequence())
|
||||
|
||||
message_request = [
|
||||
self._last_sequence,
|
||||
'\x4b',
|
||||
'\x02',
|
||||
CLASS_ID["8-bit"],
|
||||
PATH["PCCC"],
|
||||
'\x07',
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
'\x0f',
|
||||
'\x00',
|
||||
self._last_sequence[1],
|
||||
self._last_sequence[0],
|
||||
res[2]['write_func'],
|
||||
pack_usint(data_size * n),
|
||||
pack_usint(int(res[2]['file_number'])),
|
||||
PCCC_DATA_TYPE[res[2]['file_type']],
|
||||
pack_usint(int(res[2]['element_number'])),
|
||||
pack_usint(sub_element)
|
||||
]
|
||||
|
||||
logger.debug("SLC write_tag({0},{1})".format(tag, value))
|
||||
if self.send_unit_data(
|
||||
build_common_packet_format(
|
||||
DATA_ITEM['Connected'],
|
||||
''.join(message_request) + data_to_write,
|
||||
ADDRESS_ITEM['Connection Based'],
|
||||
addr_data=self._target_cid,)):
|
||||
sts = int(unpack_usint(self._reply[58]))
|
||||
try:
|
||||
if sts != 0:
|
||||
sts_txt = PCCC_ERROR_CODE[sts]
|
||||
self._status = (1000, "Error({0}) returned from SLC write_tag({1},{2})".format(sts_txt, tag, value))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error({0}) returned from SLC write_tag({1},{2})".format(sts_txt, tag, value))
|
||||
|
||||
return True
|
||||
except Exception as e:
|
||||
self._status = (1000, "Error({0}) parsing the data returned from "
|
||||
"SLC write_tag({1},{2})".format(e, tag, value))
|
||||
logger.warning(self._status)
|
||||
raise DataError("Error({0}) parsing the data returned from "
|
||||
"SLC write_tag({1},{2})".format(e, tag, value))
|
||||
else:
|
||||
raise DataError("send_unit_data returned not valid data")
|
||||
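And the matching write side, again only a sketch with a hypothetical address:

c = Driver()
if c.open('192.168.1.10'):
    c.write_tag('N7:0', 21)                      # single element
    c.write_tag('N7:0', [-30, 32767, -32767])    # three consecutive elements
    print c.read_tag('N7:0', 3)                  # read them back
    c.close()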
1
daq_sample/pycomm-master/pycomm/cip/__init__.py
Executable file
@@ -0,0 +1 @@
|
||||
__author__ = 'agostino'
|
||||
864
daq_sample/pycomm-master/pycomm/cip/cip_base.py
Executable file
@@ -0,0 +1,864 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# cip_base.py - A set of classes methods and structures used to implement Ethernet/IP
|
||||
#
|
||||
#
|
||||
# Copyright (c) 2014 Agostino Ruscito <ruscito@gmail.com>
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
# of this software and associated documentation files (the "Software"), to deal
|
||||
# in the Software without restriction, including without limitation the rights
|
||||
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
# copies of the Software, and to permit persons to whom the Software is
|
||||
# furnished to do so, subject to the following conditions:
|
||||
#
|
||||
# The above copyright notice and this permission notice shall be included in all
|
||||
# copies or substantial portions of the Software.
|
||||
#
|
||||
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
# SOFTWARE.
|
||||
#
|
||||
|
||||
import struct
|
||||
import socket
|
||||
import random
|
||||
|
||||
from os import getpid
|
||||
from pycomm.cip.cip_const import *
|
||||
from pycomm.common import PycommError
|
||||
|
||||
|
||||
import logging
|
||||
try: # Python 2.7+
|
||||
from logging import NullHandler
|
||||
except ImportError:
|
||||
class NullHandler(logging.Handler):
|
||||
def emit(self, record):
|
||||
pass
|
||||
logger = logging.getLogger(__name__)
|
||||
logger.addHandler(NullHandler())
|
||||
|
||||
|
||||
class CommError(PycommError):
|
||||
pass
|
||||
|
||||
|
||||
class DataError(PycommError):
|
||||
pass
|
||||
|
||||
|
||||
def pack_sint(n):
|
||||
return struct.pack('b', n)
|
||||
|
||||
|
||||
def pack_usint(n):
|
||||
return struct.pack('B', n)
|
||||
|
||||
|
||||
def pack_int(n):
|
||||
"""pack 16 bit into 2 bytes little endian"""
|
||||
return struct.pack('<h', n)
|
||||
|
||||
|
||||
def pack_uint(n):
|
||||
"""pack 16 bit into 2 bytes little endian"""
|
||||
return struct.pack('<H', n)
|
||||
|
||||
|
||||
def pack_dint(n):
|
||||
"""pack 32 bit into 4 bytes little endian"""
|
||||
return struct.pack('<i', n)
|
||||
|
||||
|
||||
def pack_real(r):
|
||||
"""unpack 4 bytes little endian to int"""
|
||||
return struct.pack('<f', r)
|
||||
|
||||
|
||||
def pack_lint(l):
|
||||
"""unpack 4 bytes little endian to int"""
|
||||
return struct.unpack('<q', l)
|
||||
|
||||
|
||||
def unpack_bool(st):
|
||||
if not (int(struct.unpack('B', st[0])[0]) == 0):
|
||||
return 1
|
||||
return 0
|
||||
|
||||
|
||||
def unpack_sint(st):
|
||||
return int(struct.unpack('b', st[0])[0])
|
||||
|
||||
|
||||
def unpack_usint(st):
|
||||
return int(struct.unpack('B', st[0])[0])
|
||||
|
||||
|
||||
def unpack_int(st):
|
||||
"""unpack 2 bytes little endian to int"""
|
||||
return int(struct.unpack('<h', st[0:2])[0])
|
||||
|
||||
|
||||
def unpack_uint(st):
|
||||
"""unpack 2 bytes little endian to int"""
|
||||
return int(struct.unpack('<H', st[0:2])[0])
|
||||
|
||||
|
||||
def unpack_dint(st):
|
||||
"""unpack 4 bytes little endian to int"""
|
||||
return int(struct.unpack('<i', st[0:4])[0])
|
||||
|
||||
|
||||
def unpack_real(st):
|
||||
"""unpack 4 bytes little endian to int"""
|
||||
return float(struct.unpack('<f', st[0:4])[0])
|
||||
|
||||
|
||||
def unpack_lint(st):
|
||||
"""unpack 4 bytes little endian to int"""
|
||||
return int(struct.unpack('<q', st[0:8])[0])
|
||||
|
||||
|
||||
def get_bit(value, idx):
|
||||
""":returns value of bit at position idx"""
|
||||
return (value & (1 << idx)) != 0
|
||||
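A few sanity checks on the little-endian helpers above (Python 2 byte strings):

assert pack_uint(0x1234) == '\x34\x12'
assert unpack_uint('\x34\x12') == 0x1234
assert unpack_dint('\x78\x56\x34\x12') == 0x12345678
assert get_bit(0b00100100, 2) and not get_bit(0b00100100, 3)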
|
||||
|
||||
PACK_DATA_FUNCTION = {
|
||||
'BOOL': pack_sint,
|
||||
'SINT': pack_sint, # Signed 8-bit integer
|
||||
'INT': pack_int, # Signed 16-bit integer
|
||||
'UINT': pack_uint, # Unsigned 16-bit integer
|
||||
'USINT': pack_usint, # Unsigned Byte Integer
|
||||
'DINT': pack_dint, # Signed 32-bit integer
|
||||
'REAL': pack_real, # 32-bit floating point
|
||||
'LINT': pack_lint,
|
||||
'BYTE': pack_sint, # byte string 8-bits
|
||||
'WORD': pack_uint, # byte string 16-bits
|
||||
'DWORD': pack_dint, # byte string 32-bits
|
||||
'LWORD': pack_lint # byte string 64-bits
|
||||
}
|
||||
|
||||
|
||||
UNPACK_DATA_FUNCTION = {
|
||||
'BOOL': unpack_bool,
|
||||
'SINT': unpack_sint, # Signed 8-bit integer
|
||||
'INT': unpack_int, # Signed 16-bit integer
|
||||
'UINT': unpack_uint, # Unsigned 16-bit integer
|
||||
'USINT': unpack_usint, # Unsigned Byte Integer
|
||||
'DINT': unpack_dint, # Signed 32-bit integer
|
||||
'REAL': unpack_real, # 32-bit floating point,
|
||||
'LINT': unpack_lint,
|
||||
'BYTE': unpack_sint, # byte string 8-bits
|
||||
'WORD': unpack_uint, # byte string 16-bits
|
||||
'DWORD': unpack_dint, # byte string 32-bits
|
||||
'LWORD': unpack_lint # byte string 64-bits
|
||||
}
|
||||
|
||||
|
||||
DATA_FUNCTION_SIZE = {
|
||||
'BOOL': 1,
|
||||
'SINT': 1, # Signed 8-bit integer
|
||||
'USINT': 1, # Unisgned 8-bit integer
|
||||
'INT': 2, # Signed 16-bit integer
|
||||
'UINT': 2, # Unsigned 16-bit integer
|
||||
'DINT': 4, # Signed 32-bit integer
|
||||
'REAL': 4, # 32-bit floating point
|
||||
'LINT': 8,
|
||||
'BYTE': 1, # byte string 8-bits
|
||||
'WORD': 2, # byte string 16-bits
|
||||
'DWORD': 4, # byte string 32-bits
|
||||
'LWORD': 8 # byte string 64-bits
|
||||
}
|
||||
|
||||
UNPACK_PCCC_DATA_FUNCTION = {
|
||||
'N': unpack_int,
|
||||
'B': unpack_int,
|
||||
'T': unpack_int,
|
||||
'C': unpack_int,
|
||||
'S': unpack_int,
|
||||
'F': unpack_real,
|
||||
'A': unpack_sint,
|
||||
'R': unpack_dint,
|
||||
'O': unpack_int,
|
||||
'I': unpack_int
|
||||
}
|
||||
|
||||
PACK_PCCC_DATA_FUNCTION = {
|
||||
'N': pack_int,
|
||||
'B': pack_int,
|
||||
'T': pack_int,
|
||||
'C': pack_int,
|
||||
'S': pack_int,
|
||||
'F': pack_real,
|
||||
'A': pack_sint,
|
||||
'R': pack_dint,
|
||||
'O': pack_int,
|
||||
'I': pack_int
|
||||
}
|
||||
|
||||
def print_bytes_line(msg):
|
||||
out = ''
|
||||
for ch in msg:
|
||||
out += "{:0>2x}".format(ord(ch))
|
||||
return out
|
||||
|
||||
|
||||
def print_bytes_msg(msg, info=''):
|
||||
out = info
|
||||
new_line = True
|
||||
line = 0
|
||||
column = 0
|
||||
for idx, ch in enumerate(msg):
|
||||
if new_line:
|
||||
out += "\n({:0>4d}) ".format(line * 10)
|
||||
new_line = False
|
||||
out += "{:0>2x} ".format(ord(ch))
|
||||
if column == 9:
|
||||
new_line = True
|
||||
column = 0
|
||||
line += 1
|
||||
else:
|
||||
column += 1
|
||||
return out
|
||||
|
||||
|
||||
def get_extended_status(msg, start):
|
||||
status = unpack_usint(msg[start:start+1])
|
||||
# send_rr_data
|
||||
# 42 General Status
|
||||
# 43 Size of additional status
|
||||
# 44..n additional status
|
||||
|
||||
# send_unit_data
|
||||
# 48 General Status
|
||||
# 49 Size of additional status
|
||||
# 50..n additional status
|
||||
extended_status_size = (unpack_usint(msg[start+1:start+2]))*2
|
||||
extended_status = 0
|
||||
if extended_status_size != 0:
|
||||
# There is an additional status
|
||||
if extended_status_size == 1:
|
||||
extended_status = unpack_usint(msg[start+2:start+3])
|
||||
elif extended_status_size == 2:
|
||||
extended_status = unpack_uint(msg[start+2:start+4])
|
||||
elif extended_status_size == 4:
|
||||
extended_status = unpack_dint(msg[start+2:start+6])
|
||||
else:
|
||||
return 'Extended Status Size Unknown'
|
||||
try:
|
||||
return '{0}'.format(EXTEND_CODES[status][extended_status])
|
||||
except LookupError:
|
||||
return "Extended Status info not present"
|
||||
|
||||
|
||||
def create_tag_rp(tag, multi_requests=False):
|
||||
""" Create tag Request Packet
|
||||
|
||||
It returns the request packed wrapped around the tag passed.
|
||||
If any error it returns none
|
||||
"""
|
||||
tags = tag.split('.')
|
||||
rp = []
|
||||
index = []
|
||||
for tag in tags:
|
||||
add_index = False
|
||||
# Check if is an array tag
|
||||
if tag.find('[') != -1:
|
||||
# Remove the last square bracket
|
||||
tag = tag[:len(tag)-1]
|
||||
# Isolate the value inside bracket
|
||||
inside_value = tag[tag.find('[')+1:]
|
||||
# Now split the inside value in case it is part of a multidimensional array
index = inside_value.split(',')
# Flag the existence of one or more indexes
add_index = True
|
||||
# Get only the tag part
|
||||
tag = tag[:tag.find('[')]
|
||||
tag_length = len(tag)
|
||||
|
||||
# Create the request path
|
||||
rp.append(EXTENDED_SYMBOL) # ANSI Ext. symbolic segment
|
||||
rp.append(chr(tag_length)) # Length of the tag
|
||||
|
||||
# Add the tag to the Request path
|
||||
for char in tag:
|
||||
rp.append(char)
|
||||
# Add pad byte because total length of Request path must be word-aligned
|
||||
if tag_length % 2:
|
||||
rp.append(PADDING_BYTE)
|
||||
# Add any index
|
||||
if add_index:
|
||||
for idx in index:
|
||||
val = int(idx)
|
||||
if val <= 0xff:
|
||||
rp.append(ELEMENT_ID["8-bit"])
|
||||
rp.append(pack_usint(val))
|
||||
elif val <= 0xffff:
|
||||
rp.append(ELEMENT_ID["16-bit"]+PADDING_BYTE)
|
||||
rp.append(pack_uint(val))
|
||||
elif val <= 0xffffffff:
|
||||
rp.append(ELEMENT_ID["32-bit"]+PADDING_BYTE)
|
||||
rp.append(pack_dint(val))
|
||||
else:
|
||||
# Cannot create a valid request packet
|
||||
return None
|
||||
|
||||
# At this point the Request Path is completed,
|
||||
if multi_requests:
|
||||
request_path = chr(len(rp)/2) + ''.join(rp)
|
||||
else:
|
||||
request_path = ''.join(rp)
|
||||
return request_path
|
||||
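A worked example of the request path built above for an array tag (hypothetical tag name):

# 'Counts[5]' -> 0x91 (ANSI extended symbolic segment), length 6, 'Counts',
#                no pad byte (even length), 0x28 (8-bit element id), 0x05
assert create_tag_rp('Counts[5]') == '\x91\x06Counts\x28\x05'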
|
||||
|
||||
def build_common_packet_format(message_type, message, addr_type, addr_data=None, timeout=10):
|
||||
""" build_common_packet_format
|
||||
|
||||
It creates the common part for a CIP message. Check Volume 2 (page 2.22) of CIP specification for reference
|
||||
"""
|
||||
msg = pack_dint(0) # Interface Handle: shall be 0 for CIP
|
||||
msg += pack_uint(timeout) # timeout
|
||||
msg += pack_uint(2)                  # Item count: should be at least 2 (Address and Data)
|
||||
msg += addr_type # Address Item Type ID
|
||||
|
||||
if addr_data is not None:
|
||||
msg += pack_uint(len(addr_data)) # Address Item Length
|
||||
msg += addr_data
|
||||
else:
|
||||
msg += pack_uint(0) # Address Item Length
|
||||
msg += message_type # Data Type ID
|
||||
msg += pack_uint(len(message)) # Data Item Length
|
||||
msg += message
|
||||
return msg
|
||||
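A hedged sketch of the framing produced above for an unconnected request (the three message bytes are placeholders, not a real CIP service):

cpf = build_common_packet_format(DATA_ITEM['Unconnected'], '\x01\x02\x03',
                                 ADDRESS_ITEM['UCMM'])
# 4 interface handle + 2 timeout + 2 item count + 2 address type + 2 address
# length + 2 data type + 2 data length, then the message itself
assert len(cpf) == 16 + 3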
|
||||
|
||||
def build_multiple_service(rp_list, sequence=None):
|
||||
|
||||
mr = []
|
||||
if sequence is not None:
|
||||
mr.append(pack_uint(sequence))
|
||||
|
||||
mr.append(chr(TAG_SERVICES_REQUEST["Multiple Service Packet"])) # the Request Service
|
||||
mr.append(pack_usint(2)) # the Request Path Size length in word
|
||||
mr.append(CLASS_ID["8-bit"])
|
||||
mr.append(CLASS_CODE["Message Router"])
|
||||
mr.append(INSTANCE_ID["8-bit"])
|
||||
mr.append(pack_usint(1)) # Instance 1
|
||||
mr.append(pack_uint(len(rp_list))) # Number of service contained in the request
|
||||
|
||||
# Offset calculation
|
||||
offset = (len(rp_list) * 2) + 2
|
||||
for index, rp in enumerate(rp_list):
|
||||
if index == 0:
|
||||
mr.append(pack_uint(offset)) # Starting offset
|
||||
else:
|
||||
mr.append(pack_uint(offset))
|
||||
offset += len(rp)
|
||||
|
||||
for rp in rp_list:
|
||||
mr.append(rp)
|
||||
return mr
|
||||
|
||||
|
||||
def parse_multiple_request(message, tags, typ):
|
||||
""" parse_multi_request
|
||||
This function should be used to parse the message replied to a multi request service wrapped around the
|
||||
send_unit_data message.
|
||||
|
||||
|
||||
:param message: the full message returned from the PLC
|
||||
:param tags: The list of tags to be read
|
||||
:param typ: to specify if multi request service READ or WRITE
|
||||
:return: a list of tuple in the format [ (tag name, value, data type), ( tag name, value, data type) ].
|
||||
In case of error the tuple will be (tag name, None, None)
|
||||
"""
|
||||
offset = 50
|
||||
position = 50
|
||||
number_of_service_replies = unpack_uint(message[offset:offset+2])
|
||||
tag_list = []
|
||||
for index in range(number_of_service_replies):
|
||||
position += 2
|
||||
start = offset + unpack_uint(message[position:position+2])
|
||||
general_status = unpack_usint(message[start+2:start+3])
|
||||
|
||||
if general_status == 0:
|
||||
if typ == "READ":
|
||||
data_type = unpack_uint(message[start+4:start+6])
|
||||
try:
|
||||
value_begin = start + 6
|
||||
value_end = value_begin + DATA_FUNCTION_SIZE[I_DATA_TYPE[data_type]]
|
||||
value = message[value_begin:value_end]
|
||||
tag_list.append((tags[index],
|
||||
UNPACK_DATA_FUNCTION[I_DATA_TYPE[data_type]](value),
|
||||
I_DATA_TYPE[data_type]))
|
||||
except LookupError:
|
||||
tag_list.append((tags[index], None, None))
|
||||
else:
|
||||
tag_list.append((tags[index] + ('GOOD',)))
|
||||
else:
|
||||
if typ == "READ":
|
||||
tag_list.append((tags[index], None, None))
|
||||
else:
|
||||
tag_list.append((tags[index] + ('BAD',)))
|
||||
return tag_list
|
||||
|
||||
|
||||
class Socket:
|
||||
|
||||
def __init__(self, timeout=5.0):
|
||||
self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
|
||||
self.sock.settimeout(timeout)
|
||||
self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
|
||||
|
||||
def connect(self, host, port):
|
||||
try:
|
||||
self.sock.connect((host, port))
|
||||
except socket.timeout:
|
||||
raise CommError("Socket timeout during connection.")
|
||||
|
||||
def send(self, msg, timeout=0):
|
||||
if timeout != 0:
|
||||
self.sock.settimeout(timeout)
|
||||
total_sent = 0
|
||||
while total_sent < len(msg):
|
||||
try:
|
||||
sent = self.sock.send(msg[total_sent:])
|
||||
if sent == 0:
|
||||
raise CommError("socket connection broken.")
|
||||
total_sent += sent
|
||||
except socket.error:
|
||||
raise CommError("socket connection broken.")
|
||||
return total_sent
|
||||
|
||||
def receive(self, timeout=0):
|
||||
if timeout != 0:
|
||||
self.sock.settimeout(timeout)
|
||||
msg_len = 28
|
||||
chunks = []
|
||||
bytes_recd = 0
|
||||
one_shot = True
|
||||
while bytes_recd < msg_len:
|
||||
try:
|
||||
chunk = self.sock.recv(min(msg_len - bytes_recd, 2048))
|
||||
if chunk == '':
|
||||
raise CommError("socket connection broken.")
|
||||
if one_shot:
|
||||
data_size = int(struct.unpack('<H', chunk[2:4])[0]) # Length
|
||||
msg_len = HEADER_SIZE + data_size
|
||||
one_shot = False
|
||||
|
||||
chunks.append(chunk)
|
||||
bytes_recd += len(chunk)
|
||||
except socket.error as e:
|
||||
raise CommError(e)
|
||||
return ''.join(chunks)
|
||||
|
||||
def close(self):
|
||||
self.sock.close()
|
||||
|
||||
|
||||
def parse_symbol_type(symbol):
|
||||
""" parse_symbol_type
|
||||
|
||||
It parses the symbol according to the Rockwell spec
|
||||
:param symbol: the symbol associated to a tag
|
||||
:return: A tuple containing information about the tag
|
||||
"""
|
||||
pass
|
||||
|
||||
return None
|
||||
|
||||
|
||||
class Base(object):
|
||||
_sequence = 0
|
||||
|
||||
|
||||
def __init__(self):
|
||||
if Base._sequence == 0:
|
||||
Base._sequence = getpid()
|
||||
else:
|
||||
Base._sequence = Base._get_sequence()
|
||||
|
||||
self.__version__ = '0.3'
|
||||
self.__sock = None
|
||||
self.__direct_connections = False
|
||||
self._session = 0
|
||||
self._connection_opened = False
|
||||
self._reply = None
|
||||
self._message = None
|
||||
self._target_cid = None
|
||||
self._target_is_connected = False
|
||||
self._tag_list = []
|
||||
self._buffer = {}
|
||||
self._device_description = "Device Unknown"
|
||||
self._last_instance = 0
|
||||
self._byte_offset = 0
|
||||
self._last_position = 0
|
||||
self._more_packets_available = False
|
||||
self._last_tag_read = ()
|
||||
self._last_tag_write = ()
|
||||
self._status = (0, "")
|
||||
self._output_raw = False # indicating value should be output as raw (hex)
|
||||
|
||||
self.attribs = {'context': '_pycomm_', 'protocol version': 1, 'rpi': 5000, 'port': 0xAF12, 'timeout': 10,
|
||||
'backplane': 1, 'cpu slot': 0, 'option': 0, 'cid': '\x27\x04\x19\x71', 'csn': '\x27\x04',
|
||||
'vid': '\x09\x10', 'vsn': '\x09\x10\x19\x71', 'name': 'Base', 'ip address': None}
|
||||
|
||||
def __len__(self):
|
||||
return len(self.attribs)
|
||||
|
||||
def __getitem__(self, key):
|
||||
return self.attribs[key]
|
||||
|
||||
def __setitem__(self, key, value):
|
||||
self.attribs[key] = value
|
||||
|
||||
def __delitem__(self, key):
|
||||
try:
|
||||
del self.attribs[key]
|
||||
except LookupError:
|
||||
pass
|
||||
|
||||
def __iter__(self):
|
||||
return iter(self.attribs)
|
||||
|
||||
def __contains__(self, item):
|
||||
return item in self.attribs
|
||||
|
||||
def _check_reply(self):
|
||||
raise NotImplementedError("The method has not been implemented")
|
||||
|
||||
@staticmethod
|
||||
def _get_sequence():
|
||||
""" Increase and return the sequence used with connected messages
|
||||
|
||||
:return: The New sequence
|
||||
"""
|
||||
if Base._sequence < 65535:
|
||||
Base._sequence += 1
|
||||
else:
|
||||
Base._sequence = getpid()
|
||||
return Base._sequence
|
||||
|
||||
def nop(self):
|
||||
""" No replay command
|
||||
|
||||
A NOP provides a way for either an originator or target to determine if the TCP connection is still open.
|
||||
"""
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND['nop'], 0)
|
||||
self._send()
|
||||
|
||||
def __repr__(self):
|
||||
return self._device_description
|
||||
|
||||
def generate_cid(self):
|
||||
self.attribs['cid'] = '{0}{1}{2}{3}'.format(chr(random.randint(0, 255)), chr(random.randint(0, 255))
|
||||
, chr(random.randint(0, 255)), chr(random.randint(0, 255)))
|
||||
|
||||
def description(self):
|
||||
return self._device_description
|
||||
|
||||
def list_identity(self):
|
||||
""" ListIdentity command to locate and identify potential target
|
||||
|
||||
return True if the reply contains the device description
|
||||
"""
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND['list_identity'], 0)
|
||||
self._send()
|
||||
self._receive()
|
||||
if self._check_reply():
|
||||
try:
|
||||
self._device_description = self._reply[63:-1]
|
||||
return True
|
||||
except Exception as e:
|
||||
raise CommError(e)
|
||||
return False
|
||||
|
||||
def send_rr_data(self, msg):
|
||||
""" SendRRData transfer an encapsulated request/reply packet between the originator and target
|
||||
|
||||
:param msg: The message to be sent to the target
:return: True if the reply passed the status checks, False otherwise
|
||||
"""
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND["send_rr_data"], len(msg))
|
||||
self._message += msg
|
||||
self._send()
|
||||
self._receive()
|
||||
return self._check_reply()
|
||||
|
||||
def send_unit_data(self, msg):
|
||||
""" SendUnitData send encapsulated connected messages.
|
||||
|
||||
:param msg: The message to be sent to the target
:return: True if the reply passed the status checks, False otherwise
|
||||
"""
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND["send_unit_data"], len(msg))
|
||||
self._message += msg
|
||||
self._send()
|
||||
self._receive()
|
||||
return self._check_reply()
|
||||
|
||||
def get_status(self):
|
||||
""" Get the last status/error
|
||||
|
||||
This method can be used after any call to get any details in case of error
|
||||
:return: A tuple containing (error group, error message)
|
||||
"""
|
||||
return self._status
|
||||
|
||||
def clear(self):
|
||||
""" Clear the last status/error
|
||||
|
||||
:return: None; the status is reset to (0, "")
|
||||
"""
|
||||
self._status = (0, "")
|
||||
|
||||
def build_header(self, command, length):
|
||||
""" Build the encapsulate message header
|
||||
|
||||
The header is 24 bytes fixed length, and includes the command and the length of the optional data portion.
|
||||
|
||||
:return: the header
|
||||
"""
|
||||
try:
|
||||
h = command # Command UINT
|
||||
h += pack_uint(length) # Length UINT
|
||||
h += pack_dint(self._session) # Session Handle UDINT
|
||||
h += pack_dint(0) # Status UDINT
|
||||
h += self.attribs['context'] # Sender Context 8 bytes
|
||||
h += pack_dint(self.attribs['option']) # Option UDINT
|
||||
return h
|
||||
except Exception as e:
|
||||
raise CommError(e)
|
||||
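A quick check of the fixed header size built above, assuming the constants from cip_const.py further down are importable (HEADER_SIZE is defined there as 24):

plc = Base()
hdr = plc.build_header(ENCAPSULATION_COMMAND['nop'], 0)
# 2 command + 2 length + 4 session + 4 status + 8 sender context + 4 options
assert len(hdr) == HEADER_SIZE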
|
||||
def register_session(self):
|
||||
""" Register a new session with the communication partner
|
||||
|
||||
:return: None if any error, otherwise return the session number
|
||||
"""
|
||||
if self._session:
|
||||
return self._session
|
||||
|
||||
self._session = 0
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND['register_session'], 4)
|
||||
self._message += pack_uint(self.attribs['protocol version'])
|
||||
self._message += pack_uint(0)
|
||||
self._send()
|
||||
self._receive()
|
||||
if self._check_reply():
|
||||
self._session = unpack_dint(self._reply[4:8])
|
||||
logger.debug("Session ={0} has been registered.".format(print_bytes_line(self._reply[4:8])))
|
||||
return self._session
|
||||
|
||||
self._status = 'Warning ! the session has not been registered.'
|
||||
logger.warning(self._status)
|
||||
return None
|
||||
|
||||
def forward_open(self):
|
||||
""" CIP implementation of the forward open message
|
||||
|
||||
Refer to ODVA documentation Volume 1 3-5.5.2
|
||||
|
||||
:return: False if any error in the reply message
|
||||
"""
|
||||
if self._session == 0:
|
||||
self._status = (4, "A session need to be registered before to call forward_open.")
|
||||
raise CommError("A session need to be registered before to call forward open")
|
||||
|
||||
forward_open_msg = [
|
||||
FORWARD_OPEN,
|
||||
pack_usint(2),
|
||||
CLASS_ID["8-bit"],
|
||||
CLASS_CODE["Connection Manager"], # Volume 1: 5-1
|
||||
INSTANCE_ID["8-bit"],
|
||||
CONNECTION_MANAGER_INSTANCE['Open Request'],
|
||||
PRIORITY,
|
||||
TIMEOUT_TICKS,
|
||||
pack_dint(0),
|
||||
self.attribs['cid'],
|
||||
self.attribs['csn'],
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
TIMEOUT_MULTIPLIER,
|
||||
'\x00\x00\x00',
|
||||
pack_dint(self.attribs['rpi'] * 1000),
|
||||
pack_uint(CONNECTION_PARAMETER['Default']),
|
||||
pack_dint(self.attribs['rpi'] * 1000),
|
||||
pack_uint(CONNECTION_PARAMETER['Default']),
|
||||
TRANSPORT_CLASS, # Transport Class
|
||||
# CONNECTION_SIZE['Backplane'],
|
||||
# pack_usint(self.attribs['backplane']),
|
||||
# pack_usint(self.attribs['cpu slot']),
|
||||
CLASS_ID["8-bit"],
|
||||
CLASS_CODE["Message Router"],
|
||||
INSTANCE_ID["8-bit"],
|
||||
pack_usint(1)
|
||||
]
|
||||
|
||||
if self.__direct_connections:
|
||||
forward_open_msg[20:1] = [
|
||||
CONNECTION_SIZE['Direct Network'],
|
||||
]
|
||||
else:
|
||||
forward_open_msg[20:3] = [
|
||||
CONNECTION_SIZE['Backplane'],
|
||||
pack_usint(self.attribs['backplane']),
|
||||
pack_usint(self.attribs['cpu slot'])
|
||||
]
|
||||
|
||||
if self.send_rr_data(
|
||||
build_common_packet_format(DATA_ITEM['Unconnected'], ''.join(forward_open_msg), ADDRESS_ITEM['UCMM'],)):
|
||||
self._target_cid = self._reply[44:48]
|
||||
self._target_is_connected = True
|
||||
return True
|
||||
self._status = (4, "forward_open returned False")
|
||||
return False
|
||||
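The slice assignments above (forward_open_msg[20:1] = [...]) and in forward_close below rely on a Python idiom: assigning to a slice whose stop is before its start splices the new items in at the start index without removing anything. For example:

lst = ['a', 'b', 'c']
lst[1:0] = ['x', 'y']   # insert at index 1, nothing is replaced
assert lst == ['a', 'x', 'y', 'b', 'c']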
|
||||
def forward_close(self):
|
||||
""" CIP implementation of the forward close message
|
||||
|
||||
Each connection opened with the forward open message needs to be closed.
|
||||
Refer to ODVA documentation Volume 1 3-5.5.3
|
||||
|
||||
:return: False if any error in the reply message
|
||||
"""
|
||||
|
||||
if self._session == 0:
|
||||
self._status = (5, "A session need to be registered before to call forward_close.")
|
||||
raise CommError("A session need to be registered before to call forward_close.")
|
||||
|
||||
forward_close_msg = [
|
||||
FORWARD_CLOSE,
|
||||
pack_usint(2),
|
||||
CLASS_ID["8-bit"],
|
||||
CLASS_CODE["Connection Manager"], # Volume 1: 5-1
|
||||
INSTANCE_ID["8-bit"],
|
||||
CONNECTION_MANAGER_INSTANCE['Open Request'],
|
||||
PRIORITY,
|
||||
TIMEOUT_TICKS,
|
||||
self.attribs['csn'],
|
||||
self.attribs['vid'],
|
||||
self.attribs['vsn'],
|
||||
# CONNECTION_SIZE['Backplane'],
|
||||
# '\x00', # Reserved
|
||||
# pack_usint(self.attribs['backplane']),
|
||||
# pack_usint(self.attribs['cpu slot']),
|
||||
CLASS_ID["8-bit"],
|
||||
CLASS_CODE["Message Router"],
|
||||
INSTANCE_ID["8-bit"],
|
||||
pack_usint(1)
|
||||
]
|
||||
|
||||
if self.__direct_connections:
|
||||
forward_close_msg[11:2] = [
|
||||
CONNECTION_SIZE['Direct Network'],
|
||||
'\x00'
|
||||
]
|
||||
else:
|
||||
forward_close_msg[11:4] = [
|
||||
CONNECTION_SIZE['Backplane'],
|
||||
'\x00',
|
||||
pack_usint(self.attribs['backplane']),
|
||||
pack_usint(self.attribs['cpu slot'])
|
||||
]
|
||||
|
||||
if self.send_rr_data(
|
||||
build_common_packet_format(DATA_ITEM['Unconnected'], ''.join(forward_close_msg), ADDRESS_ITEM['UCMM'])):
|
||||
self._target_is_connected = False
|
||||
return True
|
||||
self._status = (5, "forward_close returned False")
|
||||
logger.warning(self._status)
|
||||
return False
|
||||
|
||||
def un_register_session(self):
|
||||
""" Un-register a connection
|
||||
|
||||
"""
|
||||
self._message = self.build_header(ENCAPSULATION_COMMAND['unregister_session'], 0)
|
||||
self._send()
|
||||
self._session = None
|
||||
|
||||
def _send(self):
|
||||
"""
|
||||
socket send
|
||||
:return: true if no error otherwise false
|
||||
"""
|
||||
try:
|
||||
logger.debug(print_bytes_msg(self._message, '-------------- SEND --------------'))
|
||||
self.__sock.send(self._message)
|
||||
except Exception as e:
|
||||
# self.clean_up()
|
||||
raise CommError(e)
|
||||
|
||||
def _receive(self):
|
||||
"""
|
||||
socket receive
|
||||
:return: true if no error otherwise false
|
||||
"""
|
||||
try:
|
||||
self._reply = self.__sock.receive()
|
||||
logger.debug(print_bytes_msg(self._reply, '----------- RECEIVE -----------'))
|
||||
except Exception as e:
|
||||
# self.clean_up()
|
||||
raise CommError(e)
|
||||
|
||||
def open(self, ip_address, direct_connection=False):
|
||||
"""
|
||||
socket open
|
||||
:param: ip address to connect to and type of connection. By default direct connection is disabled
|
||||
:return: true if no error otherwise false
|
||||
"""
|
||||
# set type of connection needed
|
||||
self.__direct_connections = direct_connection
|
||||
|
||||
# handle the socket layer
|
||||
if not self._connection_opened:
|
||||
try:
|
||||
if self.__sock is None:
|
||||
self.__sock = Socket()
|
||||
self.__sock.connect(ip_address, self.attribs['port'])
|
||||
self._connection_opened = True
|
||||
self.attribs['ip address'] = ip_address
|
||||
self.generate_cid()
|
||||
if self.register_session() is None:
|
||||
self._status = (13, "Session not registered")
|
||||
return False
|
||||
|
||||
# not sure but maybe I can remove this because it is used to clean up any previous unclosed connection
|
||||
self.forward_close()
|
||||
return True
|
||||
except Exception as e:
|
||||
# self.clean_up()
|
||||
raise CommError(e)
|
||||
|
||||
def close(self):
|
||||
"""
|
||||
socket close
|
||||
:return: true if no error otherwise false
|
||||
"""
|
||||
try:
|
||||
if self._target_is_connected:
|
||||
self.forward_close()
|
||||
if self._session != 0:
|
||||
self.un_register_session()
|
||||
if self.__sock:
|
||||
self.__sock.close()
|
||||
except Exception as e:
|
||||
raise CommError(e)
|
||||
|
||||
self.clean_up()
|
||||
|
||||
def clean_up(self):
|
||||
self.__sock = None
|
||||
self._target_is_connected = False
|
||||
self._session = 0
|
||||
self._connection_opened = False
|
||||
|
||||
def is_connected(self):
|
||||
return self._connection_opened
|
||||
483
daq_sample/pycomm-master/pycomm/cip/cip_const.py
Executable file
@@ -0,0 +1,483 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# cip_const.py - A set of structures and constants used to implement the Ethernet/IP protocol
|
||||
#
|
||||
#
|
||||
# Copyright (c) 2014 Agostino Ruscito <ruscito@gmail.com>
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
# of this software and associated documentation files (the "Software"), to deal
|
||||
# in the Software without restriction, including without limitation the rights
|
||||
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
# copies of the Software, and to permit persons to whom the Software is
|
||||
# furnished to do so, subject to the following conditions:
|
||||
#
|
||||
# The above copyright notice and this permission notice shall be included in all
|
||||
# copies or substantial portions of the Software.
|
||||
#
|
||||
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
# SOFTWARE.
|
||||
#
|
||||
|
||||
ELEMENT_ID = {
|
||||
"8-bit": '\x28',
|
||||
"16-bit": '\x29',
|
||||
"32-bit": '\x2a'
|
||||
}
|
||||
|
||||
CLASS_ID = {
|
||||
"8-bit": '\x20',
|
||||
"16-bit": '\x21',
|
||||
}
|
||||
|
||||
INSTANCE_ID = {
|
||||
"8-bit": '\x24',
|
||||
"16-bit": '\x25'
|
||||
}
|
||||
|
||||
ATTRIBUTE_ID = {
|
||||
"8-bit": '\x30',
|
||||
"16-bit": '\x31'
|
||||
}
|
||||
|
||||
# Path are combined as:
|
||||
# CLASS_ID + PATHS
|
||||
# For example PCCC path is CLASS_ID["8-bit"]+PATH["PCCC"] -> 0x20, 0x67, 0x24, 0x01.
|
||||
PATH = {
|
||||
'Connection Manager': '\x06\x24\x01',
|
||||
'Router': '\x02\x24\x01',
|
||||
'Backplane Data Type': '\x66\x24\x01',
|
||||
'PCCC': '\x67\x24\x01',
|
||||
'DHCP Channel A': '\xa6\x24\x01\x01\x2c\x01',
|
||||
'DHCP Channel B': '\xa6\x24\x01\x02\x2c\x01'
|
||||
}
|
||||
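A quick check of the combination rule described in the comment above:

assert CLASS_ID["8-bit"] + PATH["PCCC"] == '\x20\x67\x24\x01'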
|
||||
ENCAPSULATION_COMMAND = { # Volume 2: 2-3.2 Command Field UINT 2 byte
|
||||
"nop": '\x00\x00',
|
||||
"list_targets": '\x01\x00',
|
||||
"list_services": '\x04\x00',
|
||||
"list_identity": '\x63\x00',
|
||||
"list_interfaces": '\x64\x00',
|
||||
"register_session": '\x65\x00',
|
||||
"unregister_session": '\x66\x00',
|
||||
"send_rr_data": '\x6F\x00',
|
||||
"send_unit_data": '\x70\x00'
|
||||
}
|
||||
|
||||
"""
|
||||
When a tag is created, an instance of the Symbol Object (Class ID 0x6B) is created
|
||||
inside the controller.
|
||||
|
||||
When a UDT is created, an instance of the Template object (Class ID 0x6C) is
|
||||
created to hold information about the structure makeup.
|
||||
"""
|
||||
CLASS_CODE = {
|
||||
"Message Router": '\x02', # Volume 1: 5-1
|
||||
"Symbol Object": '\x6b',
|
||||
"Template Object": '\x6c',
|
||||
"Connection Manager": '\x06' # Volume 1: 3-5
|
||||
}
|
||||
|
||||
CONNECTION_MANAGER_INSTANCE = {
|
||||
'Open Request': '\x01',
|
||||
'Open Format Rejected': '\x02',
|
||||
'Open Resource Rejected': '\x03',
|
||||
'Open Other Rejected': '\x04',
|
||||
'Close Request': '\x05',
|
||||
'Close Format Request': '\x06',
|
||||
'Close Other Request': '\x07',
|
||||
'Connection Timeout': '\x08'
|
||||
}
|
||||
|
||||
TAG_SERVICES_REQUEST = {
|
||||
"Read Tag": 0x4c,
|
||||
"Read Tag Fragmented": 0x52,
|
||||
"Write Tag": 0x4d,
|
||||
"Write Tag Fragmented": 0x53,
|
||||
"Read Modify Write Tag": 0x4e,
|
||||
"Multiple Service Packet": 0x0a,
|
||||
"Get Instance Attributes List": 0x55,
|
||||
"Get Attributes": 0x03,
|
||||
"Read Template": 0x4c,
|
||||
}
|
||||
|
||||
TAG_SERVICES_REPLY = {
|
||||
0xcc: "Read Tag",
|
||||
0xd2: "Read Tag Fragmented",
|
||||
0xcd: "Write Tag",
|
||||
0xd3: "Write Tag Fragmented",
|
||||
0xce: "Read Modify Write Tag",
|
||||
0x8a: "Multiple Service Packet",
|
||||
0xd5: "Get Instance Attributes List",
|
||||
0x83: "Get Attributes",
|
||||
0xcc: "Read Template"
|
||||
}
|
||||
|
||||
|
||||
I_TAG_SERVICES_REPLY = {
|
||||
"Read Tag": 0xcc,
|
||||
"Read Tag Fragmented": 0xd2,
|
||||
"Write Tag": 0xcd,
|
||||
"Write Tag Fragmented": 0xd3,
|
||||
"Read Modify Write Tag": 0xce,
|
||||
"Multiple Service Packet": 0x8a,
|
||||
"Get Instance Attributes List": 0xd5,
|
||||
"Get Attributes": 0x83,
|
||||
"Read Template": 0xcc
|
||||
}
|
||||
|
||||
|
||||
"""
|
||||
EtherNet/IP Encapsulation Error Codes
|
||||
|
||||
Standard CIP Encapsulation Error returned in the cip message header
|
||||
"""
|
||||
STATUS = {
|
||||
0x0000: "Success",
|
||||
0x0001: "The sender issued an invalid or unsupported encapsulation command",
|
||||
0x0002: "Insufficient memory",
|
||||
0x0003: "Poorly formed or incorrect data in the data portion",
|
||||
0x0064: "An originator used an invalid session handle when sending an encapsulation message to the target",
|
||||
0x0065: "The target received a message of invalid length",
|
||||
0x0069: "Unsupported Protocol Version"
|
||||
}
|
||||
|
||||
"""
|
||||
MSG Error Codes:
|
||||
|
||||
The following error codes have been taken from:
|
||||
|
||||
Rockwell Automation Publication
|
||||
1756-RM003P-EN-P - December 2014
|
||||
"""
|
||||
SERVICE_STATUS = {
|
||||
0x01: "Connection failure (see extended status)",
|
||||
0x02: "Insufficient resource",
|
||||
0x03: "Invalid value",
|
||||
0x04: "IOI syntax error. A syntax error was detected decoding the Request Path (see extended status)",
|
||||
0x05: "Destination unknown, class unsupported, instance \nundefined or structure element undefined (see extended status)",
|
||||
0x06: "Insufficient Packet Space",
|
||||
0x07: "Connection lost",
|
||||
0x08: "Service not supported",
|
||||
0x09: "Error in data segment or invalid attribute value",
|
||||
0x0A: "Attribute list error",
|
||||
0x0B: "State already exist",
|
||||
0x0C: "Object state conflict",
|
||||
0x0D: "Object already exist",
|
||||
0x0E: "Attribute not settable",
|
||||
0x0F: "Permission denied",
|
||||
0x10: "Device state conflict",
|
||||
0x11: "Reply data too large",
|
||||
0x12: "Fragmentation of a primitive value",
|
||||
0x13: "Insufficient command data",
|
||||
0x14: "Attribute not supported",
|
||||
0x15: "Too much data",
|
||||
0x1A: "Bridge request too large",
|
||||
0x1B: "Bridge response too large",
|
||||
0x1C: "Attribute list shortage",
|
||||
0x1D: "Invalid attribute list",
|
||||
0x1E: "Request service error",
|
||||
0x1F: "Connection related failure (see extended status)",
|
||||
0x22: "Invalid reply received",
|
||||
0x25: "Key segment error",
|
||||
0x26: "Invalid IOI error",
|
||||
0x27: "Unexpected attribute in list",
|
||||
0x28: "DeviceNet error - invalid member ID",
|
||||
0x29: "DeviceNet error - member not settable",
|
||||
0xD1: "Module not in run state",
|
||||
0xFB: "Message port not supported",
|
||||
0xFC: "Message unsupported data type",
|
||||
0xFD: "Message uninitialized",
|
||||
0xFE: "Message timeout",
|
||||
0xff: "General Error (see extended status)"
|
||||
}
|
||||
|
||||
EXTEND_CODES = {
|
||||
0x01: {
|
||||
0x0100: "Connection in use",
|
||||
0x0103: "Transport not supported",
|
||||
0x0106: "Ownership conflict",
|
||||
0x0107: "Connection not found",
|
||||
0x0108: "Invalid connection type",
|
||||
0x0109: "Invalid connection size",
|
||||
0x0110: "Module not configured",
|
||||
0x0111: "EPR not supported",
|
||||
0x0114: "Wrong module",
|
||||
0x0115: "Wrong device type",
|
||||
0x0116: "Wrong revision",
|
||||
0x0118: "Invalid configuration format",
|
||||
0x011A: "Application out of connections",
|
||||
0x0203: "Connection timeout",
|
||||
0x0204: "Unconnected message timeout",
|
||||
0x0205: "Unconnected send parameter error",
|
||||
0x0206: "Message too large",
|
||||
0x0301: "No buffer memory",
|
||||
0x0302: "Bandwidth not available",
|
||||
0x0303: "No screeners available",
|
||||
0x0305: "Signature match",
|
||||
0x0311: "Port not available",
|
||||
0x0312: "Link address not available",
|
||||
0x0315: "Invalid segment type",
|
||||
0x0317: "Connection not scheduled"
|
||||
},
|
||||
0x04: {
|
||||
0x0000: "Extended status out of memory",
|
||||
0x0001: "Extended status out of instances"
|
||||
},
|
||||
0x05: {
|
||||
0x0000: "Extended status out of memory",
|
||||
0x0001: "Extended status out of instances"
|
||||
},
|
||||
0x1F: {
|
||||
0x0203: "Connection timeout"
|
||||
},
|
||||
0xff: {
|
||||
0x7: "Wrong data type",
|
||||
0x2001: "Excessive IOI",
|
||||
0x2002: "Bad parameter value",
|
||||
0x2018: "Semaphore reject",
|
||||
0x201B: "Size too small",
|
||||
0x201C: "Invalid size",
|
||||
0x2100: "Privilege failure",
|
||||
0x2101: "Invalid keyswitch position",
|
||||
0x2102: "Password invalid",
|
||||
0x2103: "No password issued",
|
||||
0x2104: "Address out of range",
|
||||
0x2105: "Address and how many out of range",
|
||||
0x2106: "Data in use",
|
||||
0x2107: "Type is invalid or not supported",
|
||||
0x2108: "Controller in upload or download mode",
|
||||
0x2109: "Attempt to change number of array dimensions",
|
||||
0x210A: "Invalid symbol name",
|
||||
0x210B: "Symbol does not exist",
|
||||
0x210E: "Search failed",
|
||||
0x210F: "Task cannot start",
|
||||
0x2110: "Unable to write",
|
||||
0x2111: "Unable to read",
|
||||
0x2112: "Shared routine not editable",
|
||||
0x2113: "Controller in faulted mode",
|
||||
0x2114: "Run mode inhibited"
|
||||
|
||||
}
|
||||
}
|
||||
DATA_ITEM = {
|
||||
'Connected': '\xb1\x00',
|
||||
'Unconnected': '\xb2\x00'
|
||||
}
|
||||
|
||||
ADDRESS_ITEM = {
|
||||
'Connection Based': '\xa1\x00',
|
||||
'Null': '\x00\x00',
|
||||
'UCMM': '\x00\x00'
|
||||
}
|
||||
|
||||
UCMM = {
|
||||
'Interface Handle': 0,
|
||||
'Item Count': 2,
|
||||
'Address Type ID': 0,
|
||||
'Address Length': 0,
|
||||
'Data Type ID': 0x00b2
|
||||
}
|
||||
|
||||
CONNECTION_SIZE = {
|
||||
'Backplane': '\x03', # CLX
|
||||
'Direct Network': '\x02'
|
||||
}
|
||||
|
||||
HEADER_SIZE = 24
|
||||
EXTENDED_SYMBOL = '\x91'
|
||||
BOOL_ONE = 0xff
|
||||
REQUEST_SERVICE = 0
|
||||
REQUEST_PATH_SIZE = 1
|
||||
REQUEST_PATH = 2
|
||||
SUCCESS = 0
|
||||
INSUFFICIENT_PACKETS = 6
|
||||
OFFSET_MESSAGE_REQUEST = 40
|
||||
|
||||
|
||||
FORWARD_CLOSE = '\x4e'
|
||||
UNCONNECTED_SEND = '\x52'
|
||||
FORWARD_OPEN = '\x54'
|
||||
LARGE_FORWARD_OPEN = '\x5b'
|
||||
GET_CONNECTION_DATA = '\x56'
|
||||
SEARCH_CONNECTION_DATA = '\x57'
|
||||
GET_CONNECTION_OWNER = '\x5a'
|
||||
MR_SERVICE_SIZE = 2
|
||||
|
||||
PADDING_BYTE = '\x00'
|
||||
PRIORITY = '\x0a'
|
||||
TIMEOUT_TICKS = '\x05'
|
||||
TIMEOUT_MULTIPLIER = '\x01'
|
||||
TRANSPORT_CLASS = '\xa3'
|
||||
|
||||
CONNECTION_PARAMETER = {
|
||||
'PLC5': 0x4302,
|
||||
'SLC500': 0x4302,
|
||||
'CNET': 0x4320,
|
||||
'DHP': 0x4302,
|
||||
'Default': 0x43f8,
|
||||
}
|
||||
|
||||
"""
|
||||
Atomic Data Type:
|
||||
|
||||
Bit = Bool
|
||||
Bit array = DWORD (32-bit boolean array)
|
||||
8-bit integer = SINT
|
||||
16-bit integer = INT
|
||||
32-bit integer = DINT
|
||||
32-bit float = REAL
|
||||
64-bit integer = LINT
|
||||
|
||||
From Rockwell Automation Publication 1756-PM020C-EN-P November 2012:
|
||||
When reading a BOOL tag, the values returned for 0 and 1 are 0 and 0xff, respectively.
|
||||
"""
|
||||
|
||||
S_DATA_TYPE = {
|
||||
'BOOL': 0xc1,
|
||||
'SINT': 0xc2, # Signed 8-bit integer
|
||||
'INT': 0xc3, # Signed 16-bit integer
|
||||
'DINT': 0xc4, # Signed 32-bit integer
|
||||
'LINT': 0xc5, # Signed 64-bit integer
|
||||
'USINT': 0xc6, # Unsigned 8-bit integer
|
||||
'UINT': 0xc7, # Unsigned 16-bit integer
|
||||
'UDINT': 0xc8, # Unsigned 32-bit integer
|
||||
'ULINT': 0xc9, # Unsigned 64-bit integer
|
||||
'REAL': 0xca, # 32-bit floating point
|
||||
'LREAL': 0xcb, # 64-bit floating point
|
||||
'STIME': 0xcc, # Synchronous time
|
||||
'DATE': 0xcd,
|
||||
'TIME_OF_DAY': 0xce,
|
||||
'DATE_AND_TIME': 0xcf,
|
||||
'STRING': 0xd0, # character string (1 byte per character)
|
||||
'BYTE': 0xd1, # byte string 8-bits
|
||||
'WORD': 0xd2, # byte string 16-bits
|
||||
'DWORD': 0xd3, # byte string 32-bits
|
||||
'LWORD': 0xd4, # byte string 64-bits
|
||||
'STRING2': 0xd5, # character string (2 byte per character)
|
||||
'FTIME': 0xd6, # Duration high resolution
|
||||
'LTIME': 0xd7, # Duration long
|
||||
'ITIME': 0xd8, # Duration short
|
||||
'STRINGN': 0xd9, # character string (n byte per character)
|
||||
'SHORT_STRING': 0xda, # character string (1 byte per character, 1 byte length indicator)
|
||||
'TIME': 0xdb, # Duration in milliseconds
|
||||
'EPATH': 0xdc, # CIP Path segment
|
||||
'ENGUNIT': 0xdd, # Engineering Units
|
||||
'STRINGI': 0xde # International character string
|
||||
}
|
||||
|
||||
I_DATA_TYPE = {
|
||||
0xc1: 'BOOL',
|
||||
0xc2: 'SINT', # Signed 8-bit integer
|
||||
0xc3: 'INT', # Signed 16-bit integer
|
||||
0xc4: 'DINT', # Signed 32-bit integer
|
||||
0xc5: 'LINT', # Signed 64-bit integer
|
||||
0xc6: 'USINT', # Unsigned 8-bit integer
|
||||
0xc7: 'UINT', # Unsigned 16-bit integer
|
||||
0xc8: 'UDINT', # Unsigned 32-bit integer
|
||||
0xc9: 'ULINT', # Unsigned 64-bit integer
|
||||
0xca: 'REAL', # 32-bit floating point
|
||||
0xcb: 'LREAL', # 64-bit floating point
|
||||
0xcc: 'STIME', # Synchronous time
|
||||
0xcd: 'DATE',
|
||||
0xce: 'TIME_OF_DAY',
|
||||
0xcf: 'DATE_AND_TIME',
|
||||
0xd0: 'STRING', # character string (1 byte per character)
|
||||
0xd1: 'BYTE', # byte string 8-bits
|
||||
0xd2: 'WORD', # byte string 16-bits
|
||||
0xd3: 'DWORD', # byte string 32-bits
|
||||
0xd4: 'LWORD', # byte string 64-bits
|
||||
0xd5: 'STRING2', # character string (2 byte per character)
|
||||
0xd6: 'FTIME', # Duration high resolution
|
||||
0xd7: 'LTIME', # Duration long
|
||||
0xd8: 'ITIME', # Duration short
|
||||
0xd9: 'STRINGN', # character string (n byte per character)
|
||||
0xda: 'SHORT_STRING', # character string (1 byte per character, 1 byte length indicator)
|
||||
0xdb: 'TIME', # Duration in milliseconds
|
||||
0xdc: 'EPATH', # CIP Path segment
|
||||
0xdd: 'ENGUNIT', # Engineering Units
|
||||
0xde: 'STRINGI' # International character string
|
||||
}
|
||||
|
||||
REPLAY_INFO = {
|
||||
0x4e: 'FORWARD_CLOSE (4E,00)',
|
||||
0x52: 'UNCONNECTED_SEND (52,00)',
|
||||
0x54: 'FORWARD_OPEN (54,00)',
|
||||
0x6f: 'send_rr_data (6F,00)',
|
||||
0x70: 'send_unit_data (70,00)',
|
||||
0x00: 'nop',
|
||||
0x01: 'list_targets',
|
||||
0x04: 'list_services',
|
||||
0x63: 'list_identity',
|
||||
0x64: 'list_interfaces',
|
||||
0x65: 'register_session',
|
||||
0x66: 'unregister_session',
|
||||
}
|
||||
|
||||
PCCC_DATA_TYPE = {
|
||||
'N': '\x89',
|
||||
'B': '\x85',
|
||||
'T': '\x86',
|
||||
'C': '\x87',
|
||||
'S': '\x84',
|
||||
'F': '\x8a',
|
||||
'ST': '\x8d',
|
||||
'A': '\x8e',
|
||||
'R': '\x88',
|
||||
'O': '\x8b',
|
||||
'I': '\x8c'
|
||||
}
|
||||
|
||||
PCCC_DATA_SIZE = {
|
||||
'N': 2,
|
||||
# 'L': 4,
|
||||
'B': 2,
|
||||
'T': 6,
|
||||
'C': 6,
|
||||
'S': 2,
|
||||
'F': 4,
|
||||
'ST': 84,
|
||||
'A': 2,
|
||||
'R': 6,
|
||||
'O': 2,
|
||||
'I': 2
|
||||
}
|
||||
|
||||
PCCC_CT = {
|
||||
'PRE': 1,
|
||||
'ACC': 2,
|
||||
'EN': 15,
|
||||
'TT': 14,
|
||||
'DN': 13,
|
||||
'CU': 15,
|
||||
'CD': 14,
|
||||
'OV': 12,
|
||||
'UN': 11,
|
||||
'UA': 10
|
||||
}
|
||||
|
||||
PCCC_ERROR_CODE = {
|
||||
-2: "Not Acknowledged (NAK)",
|
||||
-3: "No Reponse, Check COM Settings",
|
||||
-4: "Unknown Message from DataLink Layer",
|
||||
-5: "Invalid Address",
|
||||
-6: "Could Not Open Com Port",
|
||||
-7: "No data specified to data link layer",
|
||||
-8: "No data returned from PLC",
|
||||
-20: "No Data Returned",
|
||||
16: "Illegal Command or Format, Address may not exist or not enough elements in data file",
|
||||
32: "PLC Has a Problem and Will Not Communicate",
|
||||
48: "Remote Node Host is Missing, Disconnected, or Shut Down",
|
||||
64: "Host Could Not Complete Function Due To Hardware Fault",
|
||||
80: "Addressing problem or Memory Protect Rungs",
|
||||
96: "Function not allows due to command protection selection",
|
||||
112: "Processor is in Program mode",
|
||||
128: "Compatibility mode file missing or communication zone problem",
|
||||
144: "Remote node cannot buffer command",
|
||||
240: "Error code in EXT STS Byte"
|
||||
}
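A minimal sketch (not part of pycomm) of how the lookup tables above are typically consulted when decoding a reply; the helper names are illustrative only:

# Illustrative helpers only -- not part of pycomm.
def describe_cip_type(type_code):
    # e.g. 0xca -> 'REAL'; unknown codes are reported in hex
    return I_DATA_TYPE.get(type_code, 'UNKNOWN (0x{:02x})'.format(type_code))

def pccc_element_size(file_type):
    # e.g. 'F' (float file) -> 4 bytes per element
    return PCCC_DATA_SIZE[file_type]

def pccc_error_text(status):
    # e.g. 16 -> "Illegal Command or Format, ..."
    return PCCC_ERROR_CODE.get(status, 'Unknown PCCC status {}'.format(status))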
|
||||
8
daq_sample/pycomm-master/pycomm/common.py
Executable file
@@ -0,0 +1,8 @@
|
||||
__author__ = 'Agostino Ruscito'
|
||||
__version__ = "1.0.8"
|
||||
__date__ = "08 03 2015"
|
||||
|
||||
class PycommError(Exception):
|
||||
pass
|
||||
|
||||
|
||||
37
daq_sample/pycomm-master/setup.py
Executable file
@@ -0,0 +1,37 @@
|
||||
from distutils.core import setup
|
||||
from pycomm import common
|
||||
import os
|
||||
|
||||
|
||||
def read(file_name):
|
||||
return open(os.path.join(os.path.dirname(__file__), file_name)).read()
|
||||
|
||||
setup(
|
||||
name="pycomm",
|
||||
author="Agostino Ruscito",
|
||||
author_email="uscito@gmail.com",
|
||||
version=common.__version__,
|
||||
description="A PLC communication library for Python",
|
||||
long_description=read('README.rst'),
|
||||
license="MIT",
|
||||
url="https://github.com/ruscito/pycomm",
|
||||
packages=[
|
||||
"pycomm",
|
||||
"pycomm.ab_comm",
|
||||
"pycomm.cip"
|
||||
],
|
||||
classifiers=[
|
||||
'Development Status :: 5 - Production/Stable',
|
||||
'Intended Audience :: Developers',
|
||||
'Natural Language :: English',
|
||||
'License :: OSI Approved :: MIT License',
|
||||
'Operating System :: OS Independent',
|
||||
'Programming Language :: Python',
|
||||
'Programming Language :: Python :: 2',
|
||||
'Programming Language :: Python :: 2.6',
|
||||
'Programming Language :: Python :: 2.7',
|
||||
'Programming Language :: Python :: 3',
|
||||
'Programming Language :: Python :: 3.3',
|
||||
'Topic :: Software Development :: Libraries :: Python Modules',
|
||||
],
|
||||
)
|
||||
1
daq_sample/pycomm_helper
Submodule
@@ -1,13 +1,13 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
'''
|
||||
MySQL Tag Server
|
||||
Sample Tag generator
|
||||
Created on April 7, 2016
|
||||
@author: Patrick McDonagh
|
||||
@description: Continuously loops through a list of tags to store values from a PLC into a MySQL database
|
||||
'''
|
||||
|
||||
from tag.tag import Tag
|
||||
from pycomm_helper.tag import Tag
|
||||
import traceback
|
||||
import time
|
||||
import random
|
||||
@@ -15,7 +15,8 @@ import requests
|
||||
import json
|
||||
|
||||
# DEFAULTS
|
||||
web_address = "https://localhost:3000"
|
||||
db_address = "10.10.10.10:5000"
|
||||
db_url = "https://{}".format(db_address)
|
||||
scan_rate = 30 # seconds
|
||||
save_all = "test" # use True, False, or any string
|
||||
|
||||
@@ -57,43 +58,41 @@ tag_store = {}
|
||||
|
||||
|
||||
def main():
|
||||
global web_address, scan_rate, save_all
|
||||
global db_address, scan_rate, save_all
|
||||
try:
|
||||
# Get tags stored in database
|
||||
get_tag_request_data = {'where': '{"tag_class": 5}'}
|
||||
get_tag_request = requests.get('{}/tag'.format(web_address), params=get_tag_request_data, verify=False)
|
||||
tags = json.loads(get_tag_request.text)
|
||||
except Exception, e:
|
||||
get_tag_request = requests.get('{}/api/tags'.format(db_url), verify=False)
|
||||
tags = json.loads(get_tag_request.text)['objects']
|
||||
except Exception as e:
|
||||
print("Error getting tags: {}".format(e))
|
||||
time.sleep(10)
|
||||
main()
|
||||
|
||||
try:
|
||||
sr_req_data = 'where={"parameter": "scan_rate"}'
|
||||
sr_req = requests.get('{}/config?{}'.format(web_address, sr_req_data), verify=False)
|
||||
sr_req = requests.get('{}/config?{}'.format(db_url, sr_req_data), verify=False)
|
||||
sr_try = json.loads(sr_req.text)
|
||||
if len(sr_try) > 0:
|
||||
scan_rate = int(sr_try[0]['val'])
|
||||
except Exception, e:
|
||||
except Exception as e:
|
||||
print("Error getting scan rage: {}".format(e))
|
||||
print("I'll just use {} seconds as the scan rate...".format(scan_rate))
|
||||
|
||||
try:
|
||||
sa_req_data = {"where": {"parameter": "save_all"}}
|
||||
sa_req = requests.get('{}/config'.format(web_address), params=sa_req_data, verify=False)
|
||||
sa_req = requests.get('{}/config'.format(db_url), params=sa_req_data, verify=False)
|
||||
sa_try = json.loads(sa_req.text)
|
||||
if len(sa_try) > 0:
|
||||
if sa_try[0]['val'].lower() == "true":
|
||||
save_all = True
|
||||
elif sa_try[0]['val'].lower() == "false":
|
||||
save_all = False
|
||||
except Exception, e:
|
||||
except Exception as e:
|
||||
print("Error getting save-all: {}".format(e))
|
||||
print("I'll just use {} as the save-all parameter...".format(save_all))
|
||||
|
||||
for t in tags:
|
||||
# name, tag, db_id, data_type, change_threshold, guarantee_sec, mapFn=None, device_type='CLX', ip_address='192.168.1.10'):
|
||||
tag_store[t['name']] = Sample(t['name'], t['tag'], t['id'], t['data_type'], t['change_threshold'], t['guarantee_sec'], mapFn=t['map_function'], ip_address=t['deviceID']['address'])
|
||||
tag_store[t['name']] = Sample(t['name'], t['tag'], t['id'], t['data_type_id'], t['change_threshold'], t['guarantee_sec'], mapFn=t['map_function'], ip_address=t['device']['address'], db_address=db_address)
|
||||
|
||||
while True:
|
||||
for tag in tag_store:
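For reference, a standalone sketch of the tag fetch the updated logger performs; the endpoint, the 'objects' envelope, and the field names are taken from the lines above, and the address assumes the compose network defined later in this diff:

# Standalone sketch (illustrative only) of the tag fetch performed above.
import json
import requests

db_url = "https://10.10.10.10:5000"
resp = requests.get('{}/api/tags'.format(db_url), verify=False)
for t in json.loads(resp.text)['objects']:
    print(t['name'], t['tag'], t['data_type_id'], t['device']['address'])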
|
||||
@@ -1,101 +0,0 @@
|
||||
CREATE DATABASE poconsole;
|
||||
USE poconsole;
|
||||
CREATE TABLE IF NOT EXISTS tag_classes(
|
||||
id INT NOT NULL AUTO_INCREMENT,
|
||||
tag_class varchar(64),
|
||||
description varchar(64),
|
||||
createdAt DATETIME,
|
||||
updatedAt DATETIME,
|
||||
PRIMARY KEY (id)
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS device_types(
|
||||
id INT NOT NULL AUTO_INCREMENT,
|
||||
dType VARCHAR(64),
|
||||
createdAt DATETIME,
|
||||
updatedAt DATETIME,
|
||||
PRIMARY KEY (id)
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS devices(
|
||||
id INT NOT NULL AUTO_INCREMENT,
|
||||
name varchar(64),
|
||||
device_type INT,
|
||||
address VARCHAR(64),
|
||||
createdAt DATETIME,
|
||||
updatedAt DATETIME,
|
||||
PRIMARY KEY (id),
|
||||
INDEX device_type_ind (device_type),
|
||||
FOREIGN KEY (device_type)
|
||||
REFERENCES device_types(id)
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS tags(
|
||||
id INT NOT NULL AUTO_INCREMENT,
|
||||
name varchar(128),
|
||||
class INT,
|
||||
tag varchar(128),
|
||||
deviceID INT,
|
||||
description varchar(128),
|
||||
data_type varchar(32),
|
||||
change_threshold float,
|
||||
guarantee_sec INT,
|
||||
map_function varchar(64),
|
||||
units varchar(64),
|
||||
minExpected varchar(64),
|
||||
maxExpected varchar(64),
|
||||
createdAt DATETIME,
|
||||
updatedAt DATETIME,
|
||||
PRIMARY KEY (id),
|
||||
INDEX class_ind (class),
|
||||
FOREIGN KEY (class)
|
||||
REFERENCES tag_classes(id)
|
||||
ON DELETE CASCADE,
|
||||
INDEX deviceID_ind (deviceID),
|
||||
FOREIGN KEY (deviceID)
|
||||
REFERENCES devices(id)
|
||||
ON DELETE CASCADE
|
||||
);
|
||||
|
||||
|
||||
CREATE TABLE IF NOT EXISTS tag_vals(
|
||||
id INT NOT NULL AUTO_INCREMENT,
|
||||
tagID int,
|
||||
val float,
|
||||
createdAt DATETIME,
|
||||
updatedAt DATETIME,
|
||||
PRIMARY KEY (id),
|
||||
INDEX tagID_ind (tagID),
|
||||
FOREIGN KEY (tagID)
|
||||
REFERENCES tags(id)
|
||||
ON DELETE CASCADE
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS config (
|
||||
id INT NOT NULL AUTO_INCREMENT,
|
||||
parameter varchar(128),
|
||||
val varchar(128),
|
||||
createdAt DATETIME,
|
||||
updatedAt DATETIME,
|
||||
PRIMARY KEY (id)
|
||||
);
|
||||
|
||||
INSERT INTO poconsole.tag_classes (id, tag_class, description) VALUES (1, 'stroke', 'Stroke Information');
|
||||
INSERT INTO poconsole.tag_classes (id, tag_class, description) VALUES (2, 'history', 'Historical Data');
|
||||
INSERT INTO poconsole.tag_classes (id, tag_class, description) VALUES (3, 'gaugeoff', 'Gauge Off Data');
|
||||
INSERT INTO poconsole.tag_classes (id, tag_class, description) VALUES (4, 'welltest', 'Well Test Data');
|
||||
INSERT INTO poconsole.tag_classes (id, tag_class, description) VALUES (5, 'custom', 'Custom tags');
|
||||
|
||||
INSERT INTO poconsole.device_types (id, dType) VALUES (1, "CLX");
|
||||
INSERT INTO poconsole.device_types (id, dType) VALUES (2, "Micro800");
|
||||
INSERT INTO poconsole.device_types (id, dType) VALUES (3, "E300");
|
||||
-- INSERT INTO poconsole.device_types (id, dType) VALUES (4, "PF755");
|
||||
|
||||
|
||||
CREATE USER 'website'@'localhost' IDENTIFIED BY 'henrypump';
|
||||
GRANT ALL ON *.* TO 'website'@'localhost';
|
||||
CREATE USER 'admin'@'localhost' IDENTIFIED BY 'henrypump';
|
||||
GRANT ALL ON *.* to 'admin'@'localhost';
|
||||
CREATE USER 'admin'@'%' IDENTIFIED BY 'henrypump';
|
||||
GRANT ALL ON *.* to 'admin'@'%';
|
||||
FLUSH PRIVILEGES;
|
||||
@@ -1,42 +0,0 @@
|
||||
CREATE TABLE IF NOT EXISTS tag_classes(
|
||||
id INTEGER PRIMARY KEY,
|
||||
tag_class TEXT,
|
||||
description TEXT
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS tags (
|
||||
id INTEGER PRIMARY KEY,
|
||||
name TEXT,
|
||||
class TEXT,
|
||||
tag TEXT,
|
||||
description TEXT,
|
||||
data_type TEXT,
|
||||
change_threshold REAL,
|
||||
guarantee_sec INTEGER,
|
||||
map_function TEXT,
|
||||
units TEXT,
|
||||
minExpected REAL,
|
||||
maxExpected REAL,
|
||||
dateAdded TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
|
||||
deleted INTEGER DEFAULT 0
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS tag_vals (
|
||||
id INTEGER PRIMARY KEY,
|
||||
tagID INTEGER,
|
||||
val REAL,
|
||||
dtime TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
CREATE TABLE IF NOT EXISTS config (
|
||||
id INTEGER PRIMARY KEY,
|
||||
parameter TEXT,
|
||||
val TEXT,
|
||||
dateAdded TIMESTAMP DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
INSERT INTO tag_classes (id, tag_class, description) VALUES (1, 'stroke', 'Stroke Information');
|
||||
INSERT INTO tag_classes (id, tag_class, description) VALUES (2, 'history', 'Historical Data');
|
||||
INSERT INTO tag_classes (id, tag_class, description) VALUES (3, 'gaugeoff', 'Gauge Off Data');
|
||||
INSERT INTO tag_classes (id, tag_class, description) VALUES (4, 'welltest', 'Well Test Data');
|
||||
INSERT INTO tag_classes (id, tag_class, description) VALUES (5, 'custom', 'Custom tags');
|
||||
26
docker-compose.yml
Normal file
@@ -0,0 +1,26 @@
|
||||
version: '2'
|
||||
services:
|
||||
web_db:
|
||||
image: henrypump/logger/web_db
|
||||
ports:
|
||||
- "443:5000"
|
||||
networks:
|
||||
poconsole:
|
||||
ipv4_address: 10.10.10.10
|
||||
daq_sample:
|
||||
image: henrypump/logger/daq_sample
|
||||
networks:
|
||||
- poconsole
|
||||
depends_on:
|
||||
- web_db
|
||||
|
||||
|
||||
networks:
|
||||
poconsole:
|
||||
driver: bridge
|
||||
driver_opts:
|
||||
com.docker.network.enable_ipv4: "true"
|
||||
ipam:
|
||||
driver: default
|
||||
config:
|
||||
- subnet: 10.10.10.0/24
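A short illustration of how the fixed addressing above is meant to be used: inside the poconsole network, web_db sits at 10.10.10.10:5000, while the host reaches the same service through the 443:5000 port mapping (the /api/tags path follows the daq_sample logger; illustrative only):

# Illustrative: the same web_db API reached from inside and outside the network above.
import requests
inside = requests.get('https://10.10.10.10:5000/api/tags', verify=False)   # from the daq_sample container
outside = requests.get('https://localhost:443/api/tags', verify=False)     # from the Docker host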
|
||||
@@ -1,37 +0,0 @@
|
||||
#! /bin/sh
|
||||
# /etc/init.d/tagserver
|
||||
|
||||
### BEGIN INIT INFO
|
||||
# Provides: tagserver
|
||||
# Required-Start: $remote_fs $syslog
|
||||
# Required-Stop: $remote_fs $syslog
|
||||
# Default-Start: 2 3 4 5
|
||||
# Default-Stop: 0 1 6
|
||||
# Short-Description: Simple script to start a program at boot
|
||||
# Description: A simple script from www.stuffaboutcode.com which will start / stop a program at boot / shutdown.
|
||||
### END INIT INFO
|
||||
|
||||
# If you want a command to always run, put it here
|
||||
|
||||
# Carry out specific functions when asked to by the system
|
||||
case "$1" in
|
||||
start)
|
||||
echo "Starting loggers"
|
||||
kill -9 $(cat /root/tagserver.pid)
|
||||
# run application you want to start
|
||||
/usr/bin/python /root/tag-server/tagserver.py > /dev/null 2>&1 & echo $! > "/root/tagserver.pid"
|
||||
|
||||
;;
|
||||
stop)
|
||||
echo "Stopping loggers"
|
||||
# kill application you want to stop
|
||||
kill -9 $(cat /root/tagserver.pid)
|
||||
|
||||
;;
|
||||
*)
|
||||
echo "Usage: /etc/init.d/tagserver {start|stop}"
|
||||
exit 1
|
||||
;;
|
||||
esac
|
||||
|
||||
exit 0
|
||||
@@ -1,18 +0,0 @@
|
||||
(dp0
|
||||
S'host'
|
||||
p1
|
||||
S'127.0.0.1'
|
||||
p2
|
||||
sS'password'
|
||||
p3
|
||||
S'henrypump'
|
||||
p4
|
||||
sS'user'
|
||||
p5
|
||||
S'website'
|
||||
p6
|
||||
sS'database'
|
||||
p7
|
||||
S'poconsole'
|
||||
p8
|
||||
s.
|
||||
@@ -1,11 +0,0 @@
|
||||
import pickle
|
||||
|
||||
mysql_cfg = {
|
||||
'host': '127.0.0.1',
|
||||
'user': 'website',
|
||||
'password': 'henrypump',
|
||||
'database': 'poconsole'
|
||||
}
|
||||
|
||||
with open('mysql_cfg.pickle', 'wb') as pickleconfig:
|
||||
pickle.dump(mysql_cfg, pickleconfig)
|
||||
1
tag
@@ -1,97 +0,0 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
'''
|
||||
MySQL Tag Server
|
||||
Created on April 7, 2016
|
||||
@author: Patrick McDonagh
|
||||
@description: Continuously loops through a list of tags to store values from a PLC into a MySQL database
|
||||
'''
|
||||
import mysql.connector as mysqlcon
|
||||
import pickle
|
||||
from tag.tag_mysql import Tag
|
||||
import traceback
|
||||
import time
|
||||
import os
|
||||
|
||||
with open(os.path.realpath('.') + '/mysql_cfg.pickle', 'rb') as pickleconfig:
|
||||
mysql_cfg = pickle.load(pickleconfig)
|
||||
|
||||
if mysql_cfg:
|
||||
db = mysqlcon.connect(**mysql_cfg)
|
||||
|
||||
tag_store = {}
|
||||
configProperties = {}
|
||||
|
||||
def main():
|
||||
db.connect()
|
||||
cur = db.cursor()
|
||||
query = "SELECT * FROM tags WHERE class = 5 AND deleted = 0"
|
||||
cur.execute(query)
|
||||
tags = cur.fetchall()
|
||||
print tags
|
||||
# [(1, u'Century Counter Up', 5, u'Century_Counter_Up', u'REAL', 10.0, 3600, None, 0)]
|
||||
db.disconnect()
|
||||
|
||||
|
||||
configObj = {}
|
||||
db.connect()
|
||||
cur = db.cursor()
|
||||
query = "SELECT parameter, val FROM config GROUP BY parameter;"
|
||||
cur.execute(query)
|
||||
config = cur.fetchall()
|
||||
db.disconnect()
|
||||
for x in config:
|
||||
configObj[x[0]] = x[1]
|
||||
|
||||
try:
|
||||
configProperties['PLC_IP_ADDRESS'] = str(configObj['ip_address'])
|
||||
print("FYI, using PLC IP Address from the database {0}".format(configProperties['PLC_IP_ADDRESS']))
|
||||
except KeyError:
|
||||
print("FYI, there is no PLC IP Address stored in the database, defaulting to 192.168.1.10")
|
||||
configProperties['PLC_IP_ADDRESS'] = "192.168.1.10"
|
||||
|
||||
try:
|
||||
configProperties['plc_type'] = str(configObj['plc_type'])
|
||||
print("FYI, using PLC Type from the database {0}".format(configProperties['plc_type']))
|
||||
except KeyError:
|
||||
print("FYI, there is no PLC Type stored in the database, defaulting to CLX")
|
||||
configProperties['plc_type'] = "CLX"
|
||||
|
||||
try:
|
||||
configProperties['scan_rate'] = int(configObj['scan_rate'])
|
||||
print("FYI, using Scan Rate from the database {0}".format(configProperties['scan_rate']))
|
||||
except KeyError:
|
||||
print("FYI, there is no Scan Rate stored in the database, defaulting to 10 seconds")
|
||||
configProperties['scan_rate'] = 10
|
||||
|
||||
try:
|
||||
sa_test = str(configObj['save_all'])
|
||||
if sa_test.lower() == "true":
|
||||
configProperties['save_all'] = True
|
||||
elif sa_test.lower() == "false":
|
||||
configProperties['save_all'] = False
|
||||
else:
|
||||
configProperties['save_all'] = "test"
|
||||
print("FYI, value for save_all is {0}".format(configProperties['save_all']))
|
||||
except KeyError:
|
||||
print("FYI, there is no save_all value stored in the database, using 'test'")
|
||||
configProperties['save_all'] = 'test'
|
||||
|
||||
|
||||
|
||||
|
||||
for t in tags:
|
||||
tag_store[t[1]] = Tag(t[1], t[3], t[0], t[5], t[6], t[7], mapFn=t[8], device_type=configProperties['plc_type'], ip_address=configProperties['PLC_IP_ADDRESS'])
|
||||
|
||||
|
||||
while True:
|
||||
for tag in tag_store:
|
||||
try:
|
||||
tag_store[tag].read(configProperties['save_all'])
|
||||
except:
|
||||
print("ERROR EVALUATING {}".format(tag))
|
||||
traceback.print_exc()
|
||||
time.sleep(configProperties['scan_rate'])
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
@@ -1,96 +0,0 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
'''
|
||||
Created on Dec 8, 2015
|
||||
|
||||
@author: Patrick McDonagh
|
||||
'''
|
||||
|
||||
import time
|
||||
import sqlite3 as lite
|
||||
from tag.tag_sqlite import Tag
|
||||
import traceback
|
||||
|
||||
# con = lite.connect("/usr/db/data.db")
|
||||
con = lite.connect('/mnt/usb/data.db')
|
||||
|
||||
configProperties = {}
|
||||
|
||||
|
||||
def main():
|
||||
|
||||
|
||||
with con:
|
||||
cur = con.cursor()
|
||||
query = "SELECT * FROM tags WHERE deleted = 0;"
|
||||
cur.execute(query)
|
||||
tags = cur.fetchall()
|
||||
|
||||
configObj = {}
|
||||
|
||||
|
||||
with con:
|
||||
cur = con.cursor()
|
||||
query = "SELECT parameter, val FROM config GROUP BY parameter;"
|
||||
cur.execute(query)
|
||||
config = cur.fetchall()
|
||||
for x in config:
|
||||
configObj[x[0]] = x[1]
|
||||
|
||||
try:
|
||||
configProperties['PLC_IP_ADDRESS'] = str(configObj['ip_address'])
|
||||
print("FYI, using PLC IP Address from the database {0}".format(configProperties['PLC_IP_ADDRESS']))
|
||||
except KeyError:
|
||||
print("FYI, there is no PLC IP Address stored in the database, defaulting to 192.168.1.10")
|
||||
configProperties['PLC_IP_ADDRESS'] = "192.168.1.10"
|
||||
|
||||
try:
|
||||
configProperties['plc_type'] = str(configObj['plc_type'])
|
||||
print("FYI, using PLC Type from the database {0}".format(configProperties['plc_type']))
|
||||
except KeyError:
|
||||
print("FYI, there is no PLC Type stored in the database, defaulting to CLX")
|
||||
configProperties['plc_type'] = "CLX"
|
||||
|
||||
try:
|
||||
configProperties['scan_rate'] = int(configObj['scan_rate'])
|
||||
print("FYI, using Scan Rate from the database {0}".format(configProperties['scan_rate']))
|
||||
except KeyError:
|
||||
print("FYI, there is no Scan Rate stored in the database, defaulting to 10 seconds")
|
||||
configProperties['scan_rate'] = 10
|
||||
|
||||
try:
|
||||
sa_test = str(configObj['save_all'])
|
||||
if sa_test.lower() == "true":
|
||||
configProperties['save_all'] = True
|
||||
elif sa_test.lower() == "false":
|
||||
configProperties['save_all'] = False
|
||||
else:
|
||||
configProperties['save_all'] = "test"
|
||||
print("FYI, value for save_all is {0}".format(configProperties['save_all']))
|
||||
except KeyError:
|
||||
print("FYI, there is no save_all value stored in the database, using 'test'")
|
||||
configProperties['save_all'] = 'test'
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
tag_store = {}
|
||||
|
||||
if len(tags) > 0:
|
||||
for t in tags:
|
||||
# (1, u'Pump Intake Pressure', u'5', u'Pump_Intake_Pressure', u'Pressure at the Intake of the Pump', None, 100.0, 3600, u'PSI', 0.0, 3000.0, u'2016-04-13 21:27:01', 0)
|
||||
tag_store[t[1]] = Tag(t[1], t[3], t[0], t[5], t[6], t[7], mapFn=t[8], device_type=configProperties['plc_type'], ip_address=configProperties['PLC_IP_ADDRESS'])
|
||||
|
||||
|
||||
while True:
|
||||
for tag in tag_store:
|
||||
try:
|
||||
tag_store[tag].read(configProperties['save_all'])
|
||||
except:
|
||||
print("ERROR EVALUATING {}".format(tag))
|
||||
traceback.print_exc()
|
||||
time.sleep(configProperties['scan_rate'])
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
23
web_db/Dockerfile.rpi
Normal file
@@ -0,0 +1,23 @@
|
||||
FROM patrickjmcd/rpi-python3:latest
|
||||
|
||||
RUN apt-get -y update
|
||||
COPY mysql-install.sh /tmp/mysql-install.sh
|
||||
RUN chmod +x /tmp/mysql-install.sh && /tmp/mysql-install.sh
|
||||
|
||||
RUN mkdir /root/tag-logger
|
||||
COPY flask /root/tag-logger/flask
|
||||
|
||||
COPY mysql-connector-python-2.1.4 /tmp/mysql
|
||||
RUN cd /tmp/mysql && python setup.py install && cd ~
|
||||
|
||||
COPY startup.sh /root/startup.sh
|
||||
RUN chmod +x /root/startup.sh
|
||||
|
||||
RUN pip install flask flask-restless flask-sqlalchemy pyopenssl
|
||||
|
||||
RUN apt-get clean
|
||||
RUN rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
|
||||
|
||||
RUN service mysql restart && python /root/tag-logger/flask/setupdb.py
|
||||
|
||||
CMD '/root/startup.sh'
|
||||
23
web_db/Dockerfile.ubuntu
Normal file
@@ -0,0 +1,23 @@
|
||||
FROM python:latest
|
||||
|
||||
RUN apt-get -y update
|
||||
COPY mysql-install.sh /tmp/mysql-install.sh
|
||||
RUN chmod +x /tmp/mysql-install.sh && /tmp/mysql-install.sh
|
||||
|
||||
RUN mkdir /root/tag-logger
|
||||
COPY flask /root/tag-logger/flask
|
||||
|
||||
COPY mysql-connector-python-2.1.4 /tmp/mysql
|
||||
RUN cd /tmp/mysql && python setup.py install && cd ~
|
||||
|
||||
COPY startup.sh /root/startup.sh
|
||||
RUN chmod +x /root/startup.sh
|
||||
|
||||
RUN pip install flask flask-restless flask-sqlalchemy pyopenssl
|
||||
|
||||
RUN apt-get clean
|
||||
RUN rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
|
||||
|
||||
RUN service mysql restart && python /root/tag-logger/flask/setupdb.py
|
||||
|
||||
CMD '/root/startup.sh'
|
||||
123
web_db/flask/app/__init__.py
Normal file
@@ -0,0 +1,123 @@
|
||||
# project/__init__.py
|
||||
|
||||
import os
|
||||
from flask import Flask, render_template, request, session, send_from_directory, jsonify, url_for, flash, redirect, Response
|
||||
from flask_sqlalchemy import SQLAlchemy
|
||||
from werkzeug.utils import secure_filename
|
||||
from sqlalchemy import and_
|
||||
import mysql.connector
|
||||
|
||||
UPLOAD_FOLDER = '/root/tag-server/flask/app/docs'
|
||||
ALLOWED_EXTENSIONS = set(['txt', 'pdf', 'png', 'jpg', 'jpeg', 'gif'])
|
||||
|
||||
app = Flask('app', static_url_path='')
|
||||
app.config.update(
|
||||
DEBUG=True,
|
||||
SQLALCHEMY_DATABASE_URI='mysql+mysqlconnector://website:henrypump@127.0.0.1/poconsole'
|
||||
# SQLALCHEMY_DATABASE_URI='sqlite:///../database.db',
|
||||
)
|
||||
app.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER
|
||||
app.config['MAX_CONTENT_LENGTH'] = 16 * 1024 * 1024
|
||||
app.secret_key = 'henry_pump'
|
||||
db = SQLAlchemy(app)
|
||||
|
||||
def allowed_file(filename):
|
||||
return '.' in filename and \
|
||||
filename.rsplit('.', 1)[1] in ALLOWED_EXTENSIONS
|
||||
|
||||
|
||||
@app.route('/', defaults={'path': ''})
|
||||
@app.route('/<path:path>')
|
||||
def catch_all(path):
|
||||
return app.send_static_file('index.html')
|
||||
|
||||
from .datalogger import datalogger
|
||||
from .datalogger.models import *
|
||||
|
||||
@app.route('/api/latest')
|
||||
def get_latest_tag_vals():
|
||||
res = db.engine.execute('SELECT v1.id as id, v1.created_on as dtime, t.id as t_id, t.name as name, t.tag as tag, v1.value as value, t.units as units, t.description as description, t.min_expected as min_expected, t.max_expected as max_expected FROM tag_vals v1 INNER JOIN tags t ON t.id = v1.tag_id WHERE v1.id = (SELECT v2.id FROM tag_vals v2 WHERE v2.tag_id = v1.tag_id ORDER BY v2.id DESC LIMIT 1) ORDER BY t.id')
|
||||
lat = res.fetchall()
|
||||
latest_tags = list(map(latest_to_obj, lat))
|
||||
return jsonify(latest_tags)
|
||||
|
||||
|
||||
@app.route('/api/valuesbetween/<string:ids>/<string:start>/<string:end>')
|
||||
def get_tag_vals_between(ids, start, end):
|
||||
ids = ids.split(',')
|
||||
res = Tag_val.query.filter(and_(Tag_val.tag_id.in_(ids), Tag_val.created_on > start, Tag_val.created_on <= end)).all()
|
||||
return jsonify([i.serialize for i in res])
|
||||
|
||||
|
||||
@app.route('/api/multipletags/<string:ids>')
|
||||
def get_multiple_tags(ids):
|
||||
ids = ids.split(',')
|
||||
res = Tag.query.filter(Tag.id.in_(ids)).all()
|
||||
return jsonify([i.serialize for i in res])
|
||||
|
||||
@app.route('/doc/upload', methods=['POST'])
|
||||
def upload_file():
|
||||
# check if the post request has the file part
|
||||
if 'file' not in request.files:
|
||||
flash('No file part')
|
||||
return redirect("/#/docs")
|
||||
file = request.files['file']
|
||||
# if the user does not select a file, the browser may also
|
||||
# submit an empty part without a filename
|
||||
if file.filename == '':
|
||||
flash('No selected file')
|
||||
return redirect("/#/docs")
|
||||
if file and allowed_file(file.filename):
|
||||
filename = secure_filename(file.filename)
|
||||
file.save(os.path.join(app.config['UPLOAD_FOLDER'], filename))
|
||||
d = Doc(name=filename)
|
||||
db.session.add(d)
|
||||
db.session.commit()
|
||||
return redirect(url_for('uploaded_file',
|
||||
filename=filename))
|
||||
return redirect("/#/docs")
|
||||
|
||||
@app.route('/docs/<filename>')
|
||||
def uploaded_file(filename):
|
||||
return send_from_directory(app.config['UPLOAD_FOLDER'],
|
||||
filename)
|
||||
|
||||
@app.route('/csv/all')
|
||||
def get_csv_all():
|
||||
csv_string = "datetime,"
|
||||
all_tags = [i.serialize for i in Tag.query.all()]
|
||||
all_tag_names = [x['name'] for x in all_tags]
|
||||
for x in all_tag_names:
|
||||
csv_string += "{},".format(x)
|
||||
csv_string += "\n"
|
||||
|
||||
all_vals = [i.serialize for i in Tag_val.query.all()]
|
||||
val_objs = [{'value': x['value'], 'tag_name': x['tag']['name'], 'datetime': x['created_on']} for x in all_vals]
|
||||
for v in val_objs:
|
||||
tag_ind = all_tag_names.index(v['tag_name'])
|
||||
csv_string += "{},".format(v['datetime']) + "," * tag_ind + "{},".format(v['value']) + "," * (len(all_tag_names) - tag_ind) + "\n"
|
||||
return Response(
|
||||
csv_string,
|
||||
mimetype="text/csv",
|
||||
headers={"Content-disposition":
|
||||
"attachment; filename=datadump.csv"})
|
||||
|
||||
@app.route('/csv/<string:ids>')
|
||||
def get_csv_selected(ids):
|
||||
csv_string = "datetime,"
|
||||
all_tags = [i.serialize for i in Tag.query.filter(Tag.id.in_(ids)).all()]
|
||||
all_tag_names = [x['name'] for x in all_tags]
|
||||
for x in all_tag_names:
|
||||
csv_string += "{},".format(x)
|
||||
csv_string += "\n"
|
||||
|
||||
all_vals = [i.serialize for i in Tag_val.query.filter(Tag_val.tag_id.in_(ids)).all()]
|
||||
val_objs = [{'value': x['value'], 'tag_name': x['tag']['name'], 'datetime': x['created_on']} for x in all_vals]
|
||||
for v in val_objs:
|
||||
tag_ind = all_tag_names.index(v['tag_name'])
|
||||
csv_string += "{},".format(v['datetime']) + "," * tag_ind + "{},".format(v['value']) + "," * (len(all_tag_names) - tag_ind) + "\n"
|
||||
return Response(
|
||||
csv_string,
|
||||
mimetype="text/csv",
|
||||
headers={"Content-disposition":
|
||||
"attachment; filename=datadump{}.csv".format(ids.replace(",","-"))})
|
||||
1
web_db/flask/app/datalogger/__init__.py
Normal file
@@ -0,0 +1 @@
|
||||
pass
|
||||
15
web_db/flask/app/datalogger/datalogger.py
Normal file
@@ -0,0 +1,15 @@
|
||||
|
||||
from flask_restless import APIManager
|
||||
|
||||
from .models import *
|
||||
from .. import app
|
||||
|
||||
manager = APIManager(app, flask_sqlalchemy_db=db)
|
||||
manager.create_api(Config, methods=['GET', 'POST', 'DELETE', 'PUT'])
|
||||
manager.create_api(Data_type, methods=['GET', 'POST', 'DELETE', 'PUT'])
|
||||
manager.create_api(Device_type, methods=['GET', 'POST', 'DELETE', 'PUT'])
|
||||
manager.create_api(Device, methods=['GET', 'POST', 'DELETE', 'PUT'])
|
||||
manager.create_api(Doc, methods=['GET', 'POST', 'DELETE', 'PUT'])
|
||||
manager.create_api(Tag_class, methods=['GET', 'POST', 'DELETE', 'PUT'])
|
||||
manager.create_api(Tag, methods=['GET', 'POST', 'DELETE', 'PUT'])
|
||||
manager.create_api(Tag_val, methods=['GET', 'POST', 'DELETE'], allow_delete_many=True)
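Each model above is exposed by flask-restless as a JSON collection under /api/. A hedged sketch of a filtered query using flask-restless's 'q' search parameter; the /api/tags path follows the daq_sample logger earlier in this diff:

# Illustrative client-side query (collection paths depend on flask-restless defaults).
import json
import requests

q = {'filters': [{'name': 'tag_class_id', 'op': 'eq', 'val': 5}]}
resp = requests.get('https://localhost/api/tags',
                    params={'q': json.dumps(q)}, verify=False)
tags = resp.json()['objects']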
|
||||
203
web_db/flask/app/datalogger/models.py
Normal file
@@ -0,0 +1,203 @@
|
||||
from datetime import datetime
|
||||
import json
|
||||
|
||||
from .. import db
|
||||
|
||||
|
||||
class Config(db.Model):
|
||||
__tablename__ = "configs"
|
||||
id = db.Column(db.Integer, primary_key=True)
|
||||
parameter = db.Column(db.String(100), unique=True)
|
||||
val = db.Column(db.String(100), unique=True)
|
||||
created_on = db.Column(db.DateTime(), default=datetime.utcnow)
|
||||
updated_on = db.Column(db.DateTime(), default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||
|
||||
@property
|
||||
def serialize(self):
|
||||
return {
|
||||
"id": self.id,
|
||||
"parameter": self.parameter,
|
||||
"val": self.val,
|
||||
"created_on": self.created_on,
|
||||
"updated_on": self.updated_on,
|
||||
}
|
||||
|
||||
|
||||
class Data_type(db.Model):
|
||||
__tablename__ = "data_types"
|
||||
id = db.Column(db.Integer, primary_key=True)
|
||||
data_type = db.Column(db.String(32), unique=True)
|
||||
plc_type = db.Column(db.String(32))
|
||||
created_on = db.Column(db.DateTime(), default=datetime.utcnow)
|
||||
updated_on = db.Column(db.DateTime(), default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||
|
||||
@property
|
||||
def serialize(self):
|
||||
return {
|
||||
"id": self.id,
|
||||
"data_type": self.data_type,
|
||||
"plc_type": self.plc_type,
|
||||
"created_on": self.created_on,
|
||||
"updated_on": self.updated_on,
|
||||
}
|
||||
|
||||
class Device_type(db.Model):
|
||||
__tablename__ = "device_types"
|
||||
id = db.Column(db.Integer, primary_key=True)
|
||||
device_type = db.Column(db.String(64))
|
||||
created_on = db.Column(db.DateTime(), default=datetime.utcnow)
|
||||
updated_on = db.Column(db.DateTime(), default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||
|
||||
@property
|
||||
def serialize(self):
|
||||
return {
|
||||
"id": self.id,
|
||||
"device_type": self.device_type,
|
||||
"created_on": self.created_on,
|
||||
"updated_on": self.updated_on,
|
||||
}
|
||||
|
||||
|
||||
class Device(db.Model):
|
||||
__tablename__ = "devices"
|
||||
id = db.Column(db.Integer, primary_key=True)
|
||||
device_type_id = db.Column(db.Integer, db.ForeignKey('device_types.id'))
|
||||
device_type = db.relationship(Device_type, primaryjoin=device_type_id==Device_type.id)
|
||||
address = db.Column(db.String(256))
|
||||
created_on = db.Column(db.DateTime(), default=datetime.utcnow)
|
||||
updated_on = db.Column(db.DateTime(), default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||
|
||||
@property
|
||||
def serialize(self):
|
||||
return {
|
||||
"id": self.id,
|
||||
"device_type_id": self.device_type_id,
|
||||
"device_type": self.device_type.serialize,
|
||||
"address": self.address,
|
||||
"created_on": self.created_on,
|
||||
"updated_on": self.updated_on,
|
||||
}
|
||||
|
||||
|
||||
class Doc(db.Model):
|
||||
__tablename__ = "docs"
|
||||
id = db.Column(db.Integer, primary_key=True)
|
||||
name = db.Column(db.String(256))
|
||||
created_on = db.Column(db.DateTime(), default=datetime.utcnow)
|
||||
updated_on = db.Column(db.DateTime(), default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||
|
||||
@property
|
||||
def serialize(self):
|
||||
return {
|
||||
"id": self.id,
|
||||
"name": self.name,
|
||||
"location": self.location,
|
||||
"description": self.description,
|
||||
"created_on": self.created_on,
|
||||
"updated_on": self.updated_on,
|
||||
}
|
||||
|
||||
class Tag_class(db.Model):
|
||||
__tablename__ = "tag_classes"
|
||||
id = db.Column(db.Integer, primary_key=True)
|
||||
class_type = db.Column(db.String(64), unique=True)
|
||||
description = db.Column(db.String(128))
|
||||
created_on = db.Column(db.DateTime(), default=datetime.utcnow)
|
||||
updated_on = db.Column(db.DateTime(), default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||
|
||||
def toJSON(self):
|
||||
return json.dumps({'id': self.id, 'class_type': self.class_type, 'description': self.description, 'created_on': self.created_on, 'updated_on': self.updated_on})
|
||||
|
||||
@property
|
||||
def serialize(self):
|
||||
return {
|
||||
"id": self.id,
|
||||
"class_type": self.class_type,
|
||||
"description": self.description,
|
||||
"created_on": self.created_on,
|
||||
"updated_on": self.updated_on,
|
||||
}
|
||||
|
||||
|
||||
class Tag(db.Model):
|
||||
__tablename__ = "tags"
|
||||
id = db.Column(db.Integer, primary_key=True)
|
||||
name = db.Column(db.String(64))
|
||||
tag_class_id = db.Column(db.Integer, db.ForeignKey('tag_classes.id'))
|
||||
tag_class = db.relationship(Tag_class)
|
||||
tag = db.Column(db.String(128))
|
||||
device_id = db.Column(db.Integer, db.ForeignKey('devices.id'))
|
||||
device = db.relationship(Device)
|
||||
description = db.Column(db.String(64))
|
||||
data_type_id = db.Column(db.Integer, db.ForeignKey('data_types.id'))
|
||||
data_type = db.relationship(Data_type)
|
||||
change_threshold = db.Column(db.Float)
|
||||
guarantee_sec = db.Column(db.Integer)
|
||||
map_function = db.Column(db.String(64))
|
||||
units = db.Column(db.String(10))
|
||||
min_expected = db.Column(db.Float)
|
||||
max_expected = db.Column(db.Float)
|
||||
created_on = db.Column(db.DateTime(), default=datetime.utcnow)
|
||||
updated_on = db.Column(db.DateTime(), default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||
|
||||
|
||||
@property
|
||||
def serialize(self):
|
||||
return {
|
||||
"id": self.id,
|
||||
"name": self.name,
|
||||
"tag_class_id": self.tag_class_id,
|
||||
"tag_class": self.tag_class.serialize,
|
||||
"tag": self.tag,
|
||||
"device_id": self.device_id,
|
||||
"device": self.device.serialize,
|
||||
"description": self.description,
|
||||
"data_type_id": self.data_type_id,
|
||||
"data_type": self.data_type.serialize,
|
||||
"change_threshold": self.change_threshold,
|
||||
"guarantee_sec": self.guarantee_sec,
|
||||
"map_function": self.map_function,
|
||||
"units": self.units,
|
||||
"min_expected": self.min_expected,
|
||||
"max_expected": self.max_expected,
|
||||
"created_on": self.created_on,
|
||||
"updated_on": self.updated_on
|
||||
}
|
||||
|
||||
def latest_to_obj(tup):
|
||||
ob = {}
|
||||
ob['id'] = tup[0]
|
||||
ob['datetime'] = str(tup[1])
|
||||
ob['tag_id'] = str(tup[2])
|
||||
ob['tag_name'] = str(tup[3])
|
||||
ob['tag'] = str(tup[4])
|
||||
ob['value'] = tup[5]
|
||||
ob['units'] = tup[6]
|
||||
ob['tag_description'] = str(tup[7])
|
||||
ob['min_expected'] = tup[8]
|
||||
ob['max_expected'] = tup[9]
|
||||
return ob
|
||||
|
||||
|
||||
class Tag_val(db.Model):
|
||||
__tablename__ = "tag_vals"
|
||||
id = db.Column(db.Integer, primary_key=True)
|
||||
tag_id = db.Column(db.Integer, db.ForeignKey('tags.id'))
|
||||
tag = db.relationship(Tag)
|
||||
value = db.Column(db.Float)
|
||||
created_on = db.Column(db.DateTime(), default=datetime.utcnow)
|
||||
updated_on = db.Column(db.DateTime(), default=datetime.utcnow, onupdate=datetime.utcnow)
|
||||
|
||||
@property
|
||||
def serialize(self):
|
||||
return {
|
||||
"id": self.id,
|
||||
"tag_id": self.tag_id,
|
||||
"tag": self.tag.serialize,
|
||||
"value": self.value,
|
||||
"created_on": self.created_on,
|
||||
"updated_on": self.updated_on
|
||||
}
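A minimal usage sketch of the models above (this is not the setupdb.py the Dockerfiles invoke, and it assumes the flask directory is importable as the 'app' package): create the schema, register one tag, and record a value:

# Illustrative only -- not the actual setupdb.py.
from app import db
from app.datalogger.models import (Config, Data_type, Device, Device_type,
                                   Tag, Tag_class, Tag_val)

db.create_all()
clx = Device_type(device_type='CLX')
plc = Device(device_type=clx, address='192.168.1.10')
custom = Tag_class(class_type='custom', description='Custom tags')
real = Data_type(data_type='REAL', plc_type='CLX')
pip = Tag(name='Pump Intake Pressure', tag='Pump_Intake_Pressure', tag_class=custom,
          device=plc, data_type=real, change_threshold=10.0, guarantee_sec=3600,
          units='PSI', min_expected=0.0, max_expected=3000.0)
db.session.add(Tag_val(tag=pip, value=118.2))        # value is a placeholder reading
db.session.add(Config(parameter='scan_rate', val='30'))
db.session.commit()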
|
||||
|
||||
|
||||
|
||||
BIN
web_db/flask/app/docs/LL_Cool_J_by_Cambria_Harkey_12263.jpg
Normal file
|
1
web_db/flask/app/static/css/LineChart.min.css
vendored
Normal file
@@ -0,0 +1 @@
|
||||
.chart-legend{margin-left:20px}.chart-legend .item{cursor:pointer;font-family:sans-serif;height:16px;font-size:.8em;font-weight:100;display:inline-block;margin-right:10px}.chart-legend .item>*{vertical-align:middle;display:inline-block}.chart-legend .item>.legend-label{height:16px;line-height:17px}.chart-legend .item>.icon{width:16px;border-radius:50%;height:16px;margin-right:5px;background-repeat:no-repeat;background-position:50% 25%}.chart-legend .item.legend-hidden{opacity:.4}.chart-legend .item.column>.icon{background-image:url("data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' style='fill: white;' width='10' height='10'><rect y='2' width='2' height='8'/><rect x='4' y='5' width='2' height='5'/><rect x='8' y='3' width='2' height='7'/></svg>")}.chart-legend .item.dot>.icon{background-image:url("data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' style='fill: white;' width='10' height='10'><circle cx='5' cy='6' r='2'/></svg>")}.chart-legend .item.dashed-line>.icon{background-image:url("data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' style='fill: white;' width='10' height='10'>\a <g style=\"stroke: white; fill: none; stroke-dasharray: 4px,2px;\">\a <path d='M0,6 L10,6'/>\a </g>\a </svg>")}.chart-legend .item.line>.icon{background-image:url("data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' style='fill: white;' width='10' height='10'>\a <g style=\"stroke: white;\">\a <path d='M0,6 L10,6'/>\a </g>\a </svg>")}.chart-legend .item.area>.icon{background-image:url("data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' style='fill: white;' width='10' height='10'><polygon points='0,10 2.428,3 5,6 7.625,5 10,10 10,10 0,10'/></svg>")}.tooltip-line{stroke:grey;stroke-width:1;shape-rendering:crispEdges}.tooltip-dot{stroke-width:2px;fill:white}.chart-tooltip{position:absolute;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none;background-color:white;z-index:100;box-shadow:1px 1px 2px rgba(61,61,61,0.5);padding:5px 10px;border-radius:1px;font-family:sans-serif;font-weight:100}.chart-tooltip>.abscissas{margin-bottom:5px;font-size:.7em;white-space:nowrap}.chart-tooltip .tooltip-item{font-size:.8em;white-space:nowrap}.chart-tooltip .tooltip-item:not(:last-child){margin-bottom:.2em}.chart-tooltip .tooltip-item>*{display:inline-block}.chart-tooltip .tooltip-item>*:not(:last-child){margin-right:.4em}.chart-tooltip .tooltip-item .color-dot{width:10px;height:10px;border-radius:50%}.chart-tooltip .tooltip-item .y-value{font-weight:500}.chart{position:relative;box-sizing:border-box;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none}.chart .axis{font:10px Roboto;shape-rendering:crispEdges}.chart .axis.x2-axis{display:none}.chart .axis>path{fill:none;stroke:black}.chart .axis>.tick>text{fill:black}.chart .axis>.tick>line{stroke:black}.chart .grid .tick>text{display:none}.chart .grid .tick>line{stroke:#eee;stroke-width:1;shape-rendering:crispEdges}.chart .dot-series circle{fill:white;stroke-width:2px}.chart .line-series path{stroke-width:1px}.chart .column-series{fill-opacity:.3}.chart .area-series{opacity:.3}.chart .chart-brush{fill:rgba(166,166,166,0.5)}.chart .hline{shape-rendering:crispEdges;stroke-width:1px}
|
||||
4
web_db/flask/app/static/css/font-awesome.min.css
vendored
Normal file
104
web_db/flask/app/static/css/ng-quick-date-default-theme.css
Normal file
@@ -0,0 +1,104 @@
|
||||
.quickdate {
|
||||
display: inline-block;
|
||||
vertical-align: bottom;
|
||||
font-size: 15px;
|
||||
font-family: 'Helvetica Neue', Helvetica, Arial, sans-serif;
|
||||
}
|
||||
.quickdate input,
|
||||
.quickdate select {
|
||||
font-size: 13px;
|
||||
}
|
||||
.quickdate-button {
|
||||
background: #ffffff;
|
||||
color: #333333;
|
||||
border: solid 1px #cccccc;
|
||||
box-shadow: outset 0 1px 1px rgba(0, 0, 0, 0.075);
|
||||
border-radius: 4px;
|
||||
padding: 4px 8px;
|
||||
display: inline-block;
|
||||
text-decoration: none;
|
||||
}
|
||||
.quickdate-button:hover {
|
||||
text-decoration: underline;
|
||||
}
|
||||
.quickdate-button:hover i {
|
||||
text-decoration: none;
|
||||
}
|
||||
.quickdate-button i {
|
||||
padding-right: 4px;
|
||||
}
|
||||
.quickdate-popup {
|
||||
color: #333333;
|
||||
font-size: 15px;
|
||||
background-color: #fafafa;
|
||||
border: solid 1px #dddddd;
|
||||
border-radius: 3px;
|
||||
-webkit-box-shadow: 0px 10px 30px rgba(25, 25, 25, 0.92);
|
||||
-moz-box-shadow: 0px 10px 30px rgba(25, 25, 25, 0.92);
|
||||
box-shadow: 0px 10px 30px rgba(25, 25, 25, 0.92);
|
||||
}
|
||||
.quickdate-action-link:visited,
|
||||
.quickdate-action-link:hover {
|
||||
color: #333333;
|
||||
}
|
||||
.quickdate-next-month i {
|
||||
padding-left: 10px;
|
||||
}
|
||||
.quickdate-prev-month i {
|
||||
padding-right: 10px;
|
||||
}
|
||||
table.quickdate-calendar {
|
||||
border: solid 1px #ccc;
|
||||
background-color: #ffffff;
|
||||
}
|
||||
table.quickdate-calendar th,
|
||||
table.quickdate-calendar td {
|
||||
border-right: 1px solid #ccc;
|
||||
border-bottom: 1px solid #ccc;
|
||||
}
|
||||
table.quickdate-calendar td:hover {
|
||||
background-color: #e6e6e6;
|
||||
}
|
||||
table.quickdate-calendar td.other-month {
|
||||
background-color: #dbdbdb;
|
||||
color: #808080;
|
||||
}
|
||||
table.quickdate-calendar td.other-month:hover {
|
||||
background-color: #c7c7c7;
|
||||
}
|
||||
table.quickdate-calendar td.disabled-date {
|
||||
background-color: inherit;
|
||||
color: #ffffff;
|
||||
}
|
||||
table.quickdate-calendar td.disabled-date:hover {
|
||||
background-color: inherit;
|
||||
cursor: default;
|
||||
}
|
||||
table.quickdate-calendar td.selected {
|
||||
background-color: #b0ccde;
|
||||
font-weight: bold;
|
||||
}
|
||||
table.quickdate-calendar td.is-today {
|
||||
color: #b58922;
|
||||
font-weight: bold;
|
||||
}
|
||||
table.quickdate-calendar td.is-today.disabled-date {
|
||||
color: #929292;
|
||||
font-weight: normal;
|
||||
}
|
||||
.quickdate-popup-footer {
|
||||
margin: 3px 1px 0;
|
||||
}
|
||||
.quickdate-clear {
|
||||
display: inline-block;
|
||||
padding: 2px 4px;
|
||||
background-color: #ffffff;
|
||||
color: #333333;
|
||||
border: solid 1px #cccccc;
|
||||
box-shadow: outset 0 1px 1px rgba(0, 0, 0, 0.075);
|
||||
border-radius: 4px;
|
||||
text-decoration: none;
|
||||
}
|
||||
.quickdate-clear:hover {
|
||||
background-color: #f2f2f2;
|
||||
}
|
||||
90
web_db/flask/app/static/css/ng-quick-date.css
Normal file
@@ -0,0 +1,90 @@
|
||||
.quickdate {
|
||||
display: inline-block;
|
||||
position: relative;
|
||||
}
|
||||
.quickdate-button div,
|
||||
.quickdate-action-link div {
|
||||
display: inline;
|
||||
}
|
||||
.quickdate-popup {
|
||||
z-index: 10;
|
||||
background-color: #fff;
|
||||
border: solid 1px #000;
|
||||
text-align: center;
|
||||
width: 250px;
|
||||
display: none;
|
||||
position: absolute;
|
||||
padding: 5px;
|
||||
}
|
||||
.quickdate-popup.open {
|
||||
display: block;
|
||||
}
|
||||
.quickdate-close {
|
||||
position: absolute;
|
||||
top: 5px;
|
||||
right: 5px;
|
||||
color: #333;
|
||||
font-size: 110%;
|
||||
margin-top: -6px;
|
||||
text-decoration: none;
|
||||
}
|
||||
.quickdate-close:hover {
|
||||
text-decoration: underline;
|
||||
}
|
||||
.quickdate-close:hover,
|
||||
.quickdate-close:visited {
|
||||
color: #333;
|
||||
}
|
||||
.quickdate-calendar-header {
|
||||
display: block;
|
||||
padding: 2px 0;
|
||||
margin-bottom: 5px;
|
||||
text-align: center;
|
||||
}
|
||||
.quickdate-month {
|
||||
display: inline-block;
|
||||
}
|
||||
a.quickdate-prev-month {
|
||||
float: left;
|
||||
}
|
||||
a.quickdate-next-month {
|
||||
float: right;
|
||||
}
|
||||
.quickdate-text-inputs {
|
||||
text-align: left;
|
||||
margin-bottom: 5px;
|
||||
}
|
||||
.quickdate-input-wrapper {
|
||||
width: 48%;
|
||||
display: inline-block;
|
||||
}
|
||||
input.quickdate-date-input,
|
||||
input.quickdate-time-input {
|
||||
width: 100px;
|
||||
margin: 0;
|
||||
height: auto;
|
||||
padding: 2px 3px;
|
||||
}
|
||||
table.quickdate-calendar {
|
||||
border-collapse: collapse;
|
||||
border-spacing: 0;
|
||||
width: 100%;
|
||||
margin-top: 5px;
|
||||
}
|
||||
table.quickdate-calendar th,
|
||||
table.quickdate-calendar td {
|
||||
padding: 5px;
|
||||
}
|
||||
table.quickdate-calendar td:hover {
|
||||
cursor: pointer;
|
||||
}
|
||||
.quickdate-popup-footer {
|
||||
text-align: right;
|
||||
display: block;
|
||||
}
|
||||
.quickdate input.ng-invalid {
|
||||
border: 1px solid #dd3b30;
|
||||
}
|
||||
.quickdate input.ng-invalid:focus {
|
||||
outline-color: #dd3b30;
|
||||
}
|
||||
BIN
web_db/flask/app/static/fonts/FontAwesome.otf
Normal file
BIN
web_db/flask/app/static/fonts/fontawesome-webfont.eot
Normal file
@@ -169,7 +169,7 @@
|
||||
<glyph unicode="" horiz-adv-x="1792" d="M1408 608v-320q0 -119 -84.5 -203.5t-203.5 -84.5h-832q-119 0 -203.5 84.5t-84.5 203.5v832q0 119 84.5 203.5t203.5 84.5h704q14 0 23 -9t9 -23v-64q0 -14 -9 -23t-23 -9h-704q-66 0 -113 -47t-47 -113v-832q0 -66 47 -113t113 -47h832q66 0 113 47t47 113v320 q0 14 9 23t23 9h64q14 0 23 -9t9 -23zM1792 1472v-512q0 -26 -19 -45t-45 -19t-45 19l-176 176l-652 -652q-10 -10 -23 -10t-23 10l-114 114q-10 10 -10 23t10 23l652 652l-176 176q-19 19 -19 45t19 45t45 19h512q26 0 45 -19t19 -45z" />
|
||||
<glyph unicode="" d="M1184 640q0 -26 -19 -45l-544 -544q-19 -19 -45 -19t-45 19t-19 45v288h-448q-26 0 -45 19t-19 45v384q0 26 19 45t45 19h448v288q0 26 19 45t45 19t45 -19l544 -544q19 -19 19 -45zM1536 992v-704q0 -119 -84.5 -203.5t-203.5 -84.5h-320q-13 0 -22.5 9.5t-9.5 22.5 q0 4 -1 20t-0.5 26.5t3 23.5t10 19.5t20.5 6.5h320q66 0 113 47t47 113v704q0 66 -47 113t-113 47h-288h-11h-13t-11.5 1t-11.5 3t-8 5.5t-7 9t-2 13.5q0 4 -1 20t-0.5 26.5t3 23.5t10 19.5t20.5 6.5h320q119 0 203.5 -84.5t84.5 -203.5z" />
|
||||
<glyph unicode="" horiz-adv-x="1664" d="M458 653q-74 162 -74 371h-256v-96q0 -78 94.5 -162t235.5 -113zM1536 928v96h-256q0 -209 -74 -371q141 29 235.5 113t94.5 162zM1664 1056v-128q0 -71 -41.5 -143t-112 -130t-173 -97.5t-215.5 -44.5q-42 -54 -95 -95q-38 -34 -52.5 -72.5t-14.5 -89.5q0 -54 30.5 -91 t97.5 -37q75 0 133.5 -45.5t58.5 -114.5v-64q0 -14 -9 -23t-23 -9h-832q-14 0 -23 9t-9 23v64q0 69 58.5 114.5t133.5 45.5q67 0 97.5 37t30.5 91q0 51 -14.5 89.5t-52.5 72.5q-53 41 -95 95q-113 5 -215.5 44.5t-173 97.5t-112 130t-41.5 143v128q0 40 28 68t68 28h288v96 q0 66 47 113t113 47h576q66 0 113 -47t47 -113v-96h288q40 0 68 -28t28 -68z" />
|
||||
<glyph unicode="" d="M519 336q4 6 -3 13q-9 7 -14 2q-4 -6 3 -13q9 -7 14 -2zM491 377q-5 7 -12 4q-6 -4 0 -12q7 -8 12 -5q6 4 0 13zM450 417q2 4 -5 8q-7 2 -8 -2q-3 -5 4 -8q8 -2 9 2zM471 394q2 1 1.5 4.5t-3.5 5.5q-6 7 -10 3t1 -11q6 -6 11 -2zM557 319q2 7 -9 11q-9 3 -13 -4 q-2 -7 9 -11q9 -3 13 4zM599 316q0 8 -12 8q-10 0 -10 -8t11 -8t11 8zM638 323q-2 7 -13 5t-9 -9q2 -8 12 -6t10 10zM1280 640q0 212 -150 362t-362 150t-362 -150t-150 -362q0 -167 98 -300.5t252 -185.5q18 -3 26.5 5t8.5 20q0 52 -1 95q-6 -1 -15.5 -2.5t-35.5 -2t-48 4 t-43.5 20t-29.5 41.5q-23 59 -57 74q-2 1 -4.5 3.5l-8 8t-7 9.5t4 7.5t19.5 3.5q6 0 15 -2t30 -15.5t33 -35.5q16 -28 37.5 -42t43.5 -14t38 3.5t30 9.5q7 47 33 69q-49 6 -86 18.5t-73 39t-55.5 76t-19.5 119.5q0 79 53 137q-24 62 5 136q19 6 54.5 -7.5t60.5 -29.5l26 -16 q58 17 128 17t128 -17q11 7 28.5 18t55.5 26t57 9q29 -74 5 -136q53 -58 53 -137q0 -57 -14 -100.5t-35.5 -70t-53.5 -44.5t-62.5 -26t-68.5 -12q35 -31 35 -95q0 -40 -0.5 -89t-0.5 -51q0 -12 8.5 -20t26.5 -5q154 52 252 185.5t98 300.5zM1536 1120v-960 q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
|
||||
<glyph unicode="" d="M394 184q-8 -9 -20 3q-13 11 -4 19q8 9 20 -3q12 -11 4 -19zM352 245q9 -12 0 -19q-8 -6 -17 7t0 18q9 7 17 -6zM291 305q-5 -7 -13 -2q-10 5 -7 12q3 5 13 2q10 -5 7 -12zM322 271q-6 -7 -16 3q-9 11 -2 16q6 6 16 -3q9 -11 2 -16zM451 159q-4 -12 -19 -6q-17 4 -13 15 t19 7q16 -5 13 -16zM514 154q0 -11 -16 -11q-17 -2 -17 11q0 11 16 11q17 2 17 -11zM572 164q2 -10 -14 -14t-18 8t14 15q16 2 18 -9zM1536 1120v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-224q-16 0 -24.5 1t-19.5 5t-16 14.5t-5 27.5v239q0 97 -52 142q57 6 102.5 18t94 39 t81 66.5t53 105t20.5 150.5q0 121 -79 206q37 91 -8 204q-28 9 -81 -11t-92 -44l-38 -24q-93 26 -192 26t-192 -26q-16 11 -42.5 27t-83.5 38.5t-86 13.5q-44 -113 -7 -204q-79 -85 -79 -206q0 -85 20.5 -150t52.5 -105t80.5 -67t94 -39t102.5 -18q-40 -36 -49 -103 q-21 -10 -45 -15t-57 -5t-65.5 21.5t-55.5 62.5q-19 32 -48.5 52t-49.5 24l-20 3q-21 0 -29 -4.5t-5 -11.5t9 -14t13 -12l7 -5q22 -10 43.5 -38t31.5 -51l10 -23q13 -38 44 -61.5t67 -30t69.5 -7t55.5 3.5l23 4q0 -38 0.5 -103t0.5 -68q0 -22 -11 -33.5t-22 -13t-33 -1.5 h-224q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
|
||||
<glyph unicode="" horiz-adv-x="1664" d="M1280 64q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1536 64q0 26 -19 45t-45 19t-45 -19t-19 -45t19 -45t45 -19t45 19t19 45zM1664 288v-320q0 -40 -28 -68t-68 -28h-1472q-40 0 -68 28t-28 68v320q0 40 28 68t68 28h427q21 -56 70.5 -92 t110.5 -36h256q61 0 110.5 36t70.5 92h427q40 0 68 -28t28 -68zM1339 936q-17 -40 -59 -40h-256v-448q0 -26 -19 -45t-45 -19h-256q-26 0 -45 19t-19 45v448h-256q-42 0 -59 40q-17 39 14 69l448 448q18 19 45 19t45 -19l448 -448q31 -30 14 -69z" />
|
||||
<glyph unicode="" d="M1407 710q0 44 -7 113.5t-18 96.5q-12 30 -17 44t-9 36.5t-4 48.5q0 23 5 68.5t5 67.5q0 37 -10 55q-4 1 -13 1q-19 0 -58 -4.5t-59 -4.5q-60 0 -176 24t-175 24q-43 0 -94.5 -11.5t-85 -23.5t-89.5 -34q-137 -54 -202 -103q-96 -73 -159.5 -189.5t-88 -236t-24.5 -248.5 q0 -40 12.5 -120t12.5 -121q0 -23 -11 -66.5t-11 -65.5t12 -36.5t34 -14.5q24 0 72.5 11t73.5 11q57 0 169.5 -15.5t169.5 -15.5q181 0 284 36q129 45 235.5 152.5t166 245.5t59.5 275zM1535 712q0 -165 -70 -327.5t-196 -288t-281 -180.5q-124 -44 -326 -44 q-57 0 -170 14.5t-169 14.5q-24 0 -72.5 -14.5t-73.5 -14.5q-73 0 -123.5 55.5t-50.5 128.5q0 24 11 68t11 67q0 40 -12.5 120.5t-12.5 121.5q0 111 18 217.5t54.5 209.5t100.5 194t150 156q78 59 232 120q194 78 316 78q60 0 175.5 -24t173.5 -24q19 0 57 5t58 5 q81 0 118 -50.5t37 -134.5q0 -23 -5 -68t-5 -68q0 -10 1 -18.5t3 -17t4 -13.5t6.5 -16t6.5 -17q16 -40 25 -118.5t9 -136.5z" />
|
||||
<glyph unicode="" horiz-adv-x="1408" d="M1408 296q0 -27 -10 -70.5t-21 -68.5q-21 -50 -122 -106q-94 -51 -186 -51q-27 0 -52.5 3.5t-57.5 12.5t-47.5 14.5t-55.5 20.5t-49 18q-98 35 -175 83q-128 79 -264.5 215.5t-215.5 264.5q-48 77 -83 175q-3 9 -18 49t-20.5 55.5t-14.5 47.5t-12.5 57.5t-3.5 52.5 q0 92 51 186q56 101 106 122q25 11 68.5 21t70.5 10q14 0 21 -3q18 -6 53 -76q11 -19 30 -54t35 -63.5t31 -53.5q3 -4 17.5 -25t21.5 -35.5t7 -28.5q0 -20 -28.5 -50t-62 -55t-62 -53t-28.5 -46q0 -9 5 -22.5t8.5 -20.5t14 -24t11.5 -19q76 -137 174 -235t235 -174 q2 -1 19 -11.5t24 -14t20.5 -8.5t22.5 -5q18 0 46 28.5t53 62t55 62t50 28.5q14 0 28.5 -7t35.5 -21.5t25 -17.5q25 -15 53.5 -31t63.5 -35t54 -30q70 -35 76 -53q3 -7 3 -21z" />
|
||||
@@ -178,7 +178,7 @@
|
||||
<glyph unicode="" d="M1280 343q0 11 -2 16q-3 8 -38.5 29.5t-88.5 49.5l-53 29q-5 3 -19 13t-25 15t-21 5q-18 0 -47 -32.5t-57 -65.5t-44 -33q-7 0 -16.5 3.5t-15.5 6.5t-17 9.5t-14 8.5q-99 55 -170.5 126.5t-126.5 170.5q-2 3 -8.5 14t-9.5 17t-6.5 15.5t-3.5 16.5q0 13 20.5 33.5t45 38.5 t45 39.5t20.5 36.5q0 10 -5 21t-15 25t-13 19q-3 6 -15 28.5t-25 45.5t-26.5 47.5t-25 40.5t-16.5 18t-16 2q-48 0 -101 -22q-46 -21 -80 -94.5t-34 -130.5q0 -16 2.5 -34t5 -30.5t9 -33t10 -29.5t12.5 -33t11 -30q60 -164 216.5 -320.5t320.5 -216.5q6 -2 30 -11t33 -12.5 t29.5 -10t33 -9t30.5 -5t34 -2.5q57 0 130.5 34t94.5 80q22 53 22 101zM1536 1120v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
|
||||
<glyph unicode="" horiz-adv-x="1664" d="M1620 1128q-67 -98 -162 -167q1 -14 1 -42q0 -130 -38 -259.5t-115.5 -248.5t-184.5 -210.5t-258 -146t-323 -54.5q-271 0 -496 145q35 -4 78 -4q225 0 401 138q-105 2 -188 64.5t-114 159.5q33 -5 61 -5q43 0 85 11q-112 23 -185.5 111.5t-73.5 205.5v4q68 -38 146 -41 q-66 44 -105 115t-39 154q0 88 44 163q121 -149 294.5 -238.5t371.5 -99.5q-8 38 -8 74q0 134 94.5 228.5t228.5 94.5q140 0 236 -102q109 21 205 78q-37 -115 -142 -178q93 10 186 50z" />
|
||||
<glyph unicode="" horiz-adv-x="1024" d="M959 1524v-264h-157q-86 0 -116 -36t-30 -108v-189h293l-39 -296h-254v-759h-306v759h-255v296h255v218q0 186 104 288.5t277 102.5q147 0 228 -12z" />
|
||||
<glyph unicode="" d="M768 1408q209 0 385.5 -103t279.5 -279.5t103 -385.5q0 -251 -146.5 -451.5t-378.5 -277.5q-27 -5 -40 7t-13 30q0 3 0.5 76.5t0.5 134.5q0 97 -52 142q57 6 102.5 18t94 39t81 66.5t53 105t20.5 150.5q0 119 -79 206q37 91 -8 204q-28 9 -81 -11t-92 -44l-38 -24 q-93 26 -192 26t-192 -26q-16 11 -42.5 27t-83.5 38.5t-85 13.5q-45 -113 -8 -204q-79 -87 -79 -206q0 -85 20.5 -150t52.5 -105t80.5 -67t94 -39t102.5 -18q-39 -36 -49 -103q-21 -10 -45 -15t-57 -5t-65.5 21.5t-55.5 62.5q-19 32 -48.5 52t-49.5 24l-20 3q-21 0 -29 -4.5 t-5 -11.5t9 -14t13 -12l7 -5q22 -10 43.5 -38t31.5 -51l10 -23q13 -38 44 -61.5t67 -30t69.5 -7t55.5 3.5l23 4q0 -38 0.5 -88.5t0.5 -54.5q0 -18 -13 -30t-40 -7q-232 77 -378.5 277.5t-146.5 451.5q0 209 103 385.5t279.5 279.5t385.5 103zM291 305q3 7 -7 12 q-10 3 -13 -2q-3 -7 7 -12q9 -6 13 2zM322 271q7 5 -2 16q-10 9 -16 3q-7 -5 2 -16q10 -10 16 -3zM352 226q9 7 0 19q-8 13 -17 6q-9 -5 0 -18t17 -7zM394 184q8 8 -4 19q-12 12 -20 3q-9 -8 4 -19q12 -12 20 -3zM451 159q3 11 -13 16q-15 4 -19 -7t13 -15q15 -6 19 6z M514 154q0 13 -17 11q-16 0 -16 -11q0 -13 17 -11q16 0 16 11zM572 164q-2 11 -18 9q-16 -3 -14 -15t18 -8t14 14z" />
|
||||
<glyph unicode="" d="M1536 640q0 -251 -146.5 -451.5t-378.5 -277.5q-27 -5 -39.5 7t-12.5 30v211q0 97 -52 142q57 6 102.5 18t94 39t81 66.5t53 105t20.5 150.5q0 121 -79 206q37 91 -8 204q-28 9 -81 -11t-92 -44l-38 -24q-93 26 -192 26t-192 -26q-16 11 -42.5 27t-83.5 38.5t-86 13.5 q-44 -113 -7 -204q-79 -85 -79 -206q0 -85 20.5 -150t52.5 -105t80.5 -67t94 -39t102.5 -18q-40 -36 -49 -103q-21 -10 -45 -15t-57 -5t-65.5 21.5t-55.5 62.5q-19 32 -48.5 52t-49.5 24l-20 3q-21 0 -29 -4.5t-5 -11.5t9 -14t13 -12l7 -5q22 -10 43.5 -38t31.5 -51l10 -23 q13 -38 44 -61.5t67 -30t69.5 -7t55.5 3.5l23 4q0 -38 0.5 -89t0.5 -54q0 -18 -13 -30t-40 -7q-232 77 -378.5 277.5t-146.5 451.5q0 209 103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
|
||||
<glyph unicode="" horiz-adv-x="1664" d="M1664 960v-256q0 -26 -19 -45t-45 -19h-64q-26 0 -45 19t-19 45v256q0 106 -75 181t-181 75t-181 -75t-75 -181v-192h96q40 0 68 -28t28 -68v-576q0 -40 -28 -68t-68 -28h-960q-40 0 -68 28t-28 68v576q0 40 28 68t68 28h672v192q0 185 131.5 316.5t316.5 131.5 t316.5 -131.5t131.5 -316.5z" />
|
||||
<glyph unicode="" horiz-adv-x="1920" d="M1760 1408q66 0 113 -47t47 -113v-1216q0 -66 -47 -113t-113 -47h-1600q-66 0 -113 47t-47 113v1216q0 66 47 113t113 47h1600zM160 1280q-13 0 -22.5 -9.5t-9.5 -22.5v-224h1664v224q0 13 -9.5 22.5t-22.5 9.5h-1600zM1760 0q13 0 22.5 9.5t9.5 22.5v608h-1664v-608 q0 -13 9.5 -22.5t22.5 -9.5h1600zM256 128v128h256v-128h-256zM640 128v128h384v-128h-384z" />
|
||||
<glyph unicode="" horiz-adv-x="1408" d="M384 192q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM896 69q2 -28 -17 -48q-18 -21 -47 -21h-135q-25 0 -43 16.5t-20 41.5q-22 229 -184.5 391.5t-391.5 184.5q-25 2 -41.5 20t-16.5 43v135q0 29 21 47q17 17 43 17h5q160 -13 306 -80.5 t259 -181.5q114 -113 181.5 -259t80.5 -306zM1408 67q2 -27 -18 -47q-18 -20 -46 -20h-143q-26 0 -44.5 17.5t-19.5 42.5q-12 215 -101 408.5t-231.5 336t-336 231.5t-408.5 102q-25 1 -42.5 19.5t-17.5 43.5v143q0 28 20 46q18 18 44 18h3q262 -13 501.5 -120t425.5 -294 q187 -186 294 -425.5t120 -501.5z" />
|
||||
@@ -484,7 +484,7 @@
|
||||
<glyph unicode="" horiz-adv-x="2048" d="M1024 13q-20 0 -93 73.5t-73 93.5q0 32 62.5 54t103.5 22t103.5 -22t62.5 -54q0 -20 -73 -93.5t-93 -73.5zM1294 284q-2 0 -40 25t-101.5 50t-128.5 25t-128.5 -25t-101 -50t-40.5 -25q-18 0 -93.5 75t-75.5 93q0 13 10 23q78 77 196 121t233 44t233 -44t196 -121 q10 -10 10 -23q0 -18 -75.5 -93t-93.5 -75zM1567 556q-11 0 -23 8q-136 105 -252 154.5t-268 49.5q-85 0 -170.5 -22t-149 -53t-113.5 -62t-79 -53t-31 -22q-17 0 -92 75t-75 93q0 12 10 22q132 132 320 205t380 73t380 -73t320 -205q10 -10 10 -22q0 -18 -75 -93t-92 -75z M1838 827q-11 0 -22 9q-179 157 -371.5 236.5t-420.5 79.5t-420.5 -79.5t-371.5 -236.5q-11 -9 -22 -9q-17 0 -92.5 75t-75.5 93q0 13 10 23q187 186 445 288t527 102t527 -102t445 -288q10 -10 10 -23q0 -18 -75.5 -93t-92.5 -75z" />
|
||||
<glyph unicode="" horiz-adv-x="1792" d="M384 0q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM768 0q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM384 384q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5 t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1152 0q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM768 384q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5 t37.5 90.5zM384 768q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1152 384q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM768 768q0 53 -37.5 90.5t-90.5 37.5 t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1536 0v384q0 52 -38 90t-90 38t-90 -38t-38 -90v-384q0 -52 38 -90t90 -38t90 38t38 90zM1152 768q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5z M1536 1088v256q0 26 -19 45t-45 19h-1280q-26 0 -45 -19t-19 -45v-256q0 -26 19 -45t45 -19h1280q26 0 45 19t19 45zM1536 768q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5zM1664 1408v-1536q0 -52 -38 -90t-90 -38 h-1408q-52 0 -90 38t-38 90v1536q0 52 38 90t90 38h1408q52 0 90 -38t38 -90z" />
<glyph unicode="" d="M1519 890q18 -84 -4 -204q-87 -444 -565 -444h-44q-25 0 -44 -16.5t-24 -42.5l-4 -19l-55 -346l-2 -15q-5 -26 -24.5 -42.5t-44.5 -16.5h-251q-21 0 -33 15t-9 36q9 56 26.5 168t26.5 168t27 167.5t27 167.5q5 37 43 37h131q133 -2 236 21q175 39 287 144q102 95 155 246 q24 70 35 133q1 6 2.5 7.5t3.5 1t6 -3.5q79 -59 98 -162zM1347 1172q0 -107 -46 -236q-80 -233 -302 -315q-113 -40 -252 -42q0 -1 -90 -1l-90 1q-100 0 -118 -96q-2 -8 -85 -530q-1 -10 -12 -10h-295q-22 0 -36.5 16.5t-11.5 38.5l232 1471q5 29 27.5 48t51.5 19h598 q34 0 97.5 -13t111.5 -32q107 -41 163.5 -123t56.5 -196z" />
<glyph unicode="" horiz-adv-x="1792" d="M441 864q32 0 52 -26q266 -364 362 -774h-446q-127 441 -367 749q-12 16 -3 33.5t29 17.5h373zM1000 507q-49 -199 -125 -393q-79 310 -256 594q40 221 44 449q211 -340 337 -650zM1099 1216q235 -324 384.5 -698.5t184.5 -773.5h-451q-41 665 -553 1472h435zM1792 640 q0 -424 -101 -812q-67 560 -359 1083q-25 301 -106 584q-4 16 5.5 28.5t25.5 12.5h359q21 0 38.5 -13t22.5 -33q115 -409 115 -850z" />
<glyph unicode="" horiz-adv-x="1792" d="M602 949q19 -61 31 -123.5t17 -141.5t-14 -159t-62 -145q-21 81 -67 157t-95.5 127t-99 90.5t-78.5 57.5t-33 19q-62 34 -81.5 100t14.5 128t101 81.5t129 -14.5q138 -83 238 -177zM927 1236q11 -25 20.5 -46t36.5 -100.5t42.5 -150.5t25.5 -179.5t0 -205.5t-47.5 -209.5 t-105.5 -208.5q-51 -72 -138 -72q-54 0 -98 31q-57 40 -69 109t28 127q60 85 81 195t13 199.5t-32 180.5t-39 128t-22 52q-31 63 -8.5 129.5t85.5 97.5q34 17 75 17q47 0 88.5 -25t63.5 -69zM1248 567q-17 -160 -72 -311q-17 131 -63 246q25 174 -5 361q-27 178 -94 342 q114 -90 212 -211q9 -37 15 -80q26 -179 7 -347zM1520 1440q9 -17 23.5 -49.5t43.5 -117.5t50.5 -178t34 -227.5t5 -269t-47 -300t-112.5 -323.5q-22 -48 -66 -75.5t-95 -27.5q-39 0 -74 16q-67 31 -92.5 100t4.5 136q58 126 90 257.5t37.5 239.5t-3.5 213.5t-26.5 180.5 t-38.5 138.5t-32.5 90t-15.5 32.5q-34 65 -11.5 135.5t87.5 104.5q37 20 81 20q49 0 91.5 -25.5t66.5 -70.5z" />
<glyph unicode="" horiz-adv-x="2304" d="M1975 546h-138q14 37 66 179l3 9q4 10 10 26t9 26l12 -55zM531 611l-58 295q-11 54 -75 54h-268l-2 -13q311 -79 403 -336zM710 960l-162 -438l-17 89q-26 70 -85 129.5t-131 88.5l135 -510h175l261 641h-176zM849 318h166l104 642h-166zM1617 944q-69 27 -149 27 q-123 0 -201 -59t-79 -153q-1 -102 145 -174q48 -23 67 -41t19 -39q0 -30 -30 -46t-69 -16q-86 0 -156 33l-22 11l-23 -144q74 -34 185 -34q130 -1 208.5 59t80.5 160q0 106 -140 174q-49 25 -71 42t-22 38q0 22 24.5 38.5t70.5 16.5q70 1 124 -24l15 -8zM2042 960h-128 q-65 0 -87 -54l-246 -588h174l35 96h212q5 -22 20 -96h154zM2304 1280v-1280q0 -52 -38 -90t-90 -38h-2048q-52 0 -90 38t-38 90v1280q0 52 38 90t90 38h2048q52 0 90 -38t38 -90z" />
<glyph unicode="" horiz-adv-x="2304" d="M671 603h-13q-47 0 -47 -32q0 -22 20 -22q17 0 28 15t12 39zM1066 639h62v3q1 4 0.5 6.5t-1 7t-2 8t-4.5 6.5t-7.5 5t-11.5 2q-28 0 -36 -38zM1606 603h-12q-48 0 -48 -32q0 -22 20 -22q17 0 28 15t12 39zM1925 629q0 41 -30 41q-19 0 -31 -20t-12 -51q0 -42 28 -42 q20 0 32.5 20t12.5 52zM480 770h87l-44 -262h-56l32 201l-71 -201h-39l-4 200l-34 -200h-53l44 262h81l2 -163zM733 663q0 -6 -4 -42q-16 -101 -17 -113h-47l1 22q-20 -26 -58 -26q-23 0 -37.5 16t-14.5 42q0 39 26 60.5t73 21.5q14 0 23 -1q0 3 0.5 5.5t1 4.5t0.5 3 q0 20 -36 20q-29 0 -59 -10q0 4 7 48q38 11 67 11q74 0 74 -62zM889 721l-8 -49q-22 3 -41 3q-27 0 -27 -17q0 -8 4.5 -12t21.5 -11q40 -19 40 -60q0 -72 -87 -71q-34 0 -58 6q0 2 7 49q29 -8 51 -8q32 0 32 19q0 7 -4.5 11.5t-21.5 12.5q-43 20 -43 59q0 72 84 72 q30 0 50 -4zM977 721h28l-7 -52h-29q-2 -17 -6.5 -40.5t-7 -38.5t-2.5 -18q0 -16 19 -16q8 0 16 2l-8 -47q-21 -7 -40 -7q-43 0 -45 47q0 12 8 56q3 20 25 146h55zM1180 648q0 -23 -7 -52h-111q-3 -22 10 -33t38 -11q30 0 58 14l-9 -54q-30 -8 -57 -8q-95 0 -95 95 q0 55 27.5 90.5t69.5 35.5q35 0 55.5 -21t20.5 -56zM1319 722q-13 -23 -22 -62q-22 2 -31 -24t-25 -128h-56l3 14q22 130 29 199h51l-3 -33q14 21 25.5 29.5t28.5 4.5zM1506 763l-9 -57q-28 14 -50 14q-31 0 -51 -27.5t-20 -70.5q0 -30 13.5 -47t38.5 -17q21 0 48 13 l-10 -59q-28 -8 -50 -8q-45 0 -71.5 30.5t-26.5 82.5q0 70 35.5 114.5t91.5 44.5q26 0 61 -13zM1668 663q0 -18 -4 -42q-13 -79 -17 -113h-46l1 22q-20 -26 -59 -26q-23 0 -37 16t-14 42q0 39 25.5 60.5t72.5 21.5q15 0 23 -1q2 7 2 13q0 20 -36 20q-29 0 -59 -10q0 4 8 48 q38 11 67 11q73 0 73 -62zM1809 722q-14 -24 -21 -62q-23 2 -31.5 -23t-25.5 -129h-56l3 14q19 104 29 199h52q0 -11 -4 -33q15 21 26.5 29.5t27.5 4.5zM1950 770h56l-43 -262h-53l3 19q-23 -23 -52 -23q-31 0 -49.5 24t-18.5 64q0 53 27.5 92t64.5 39q31 0 53 -29z M2061 640q0 148 -72.5 273t-198 198t-273.5 73q-181 0 -328 -110q127 -116 171 -284h-50q-44 150 -158 253q-114 -103 -158 -253h-50q44 168 171 284q-147 110 -328 110q-148 0 -273.5 -73t-198 -198t-72.5 -273t72.5 -273t198 -198t273.5 -73q181 0 328 110 q-120 111 -165 264h50q46 -138 152 -233q106 95 152 233h50q-45 -153 -165 -264q147 -110 328 -110q148 0 273.5 73t198 198t72.5 273zM2304 1280v-1280q0 -52 -38 -90t-90 -38h-2048q-52 0 -90 38t-38 90v1280q0 52 38 90t90 38h2048q52 0 90 -38t38 -90z" />
<glyph unicode="" horiz-adv-x="2304" d="M313 759q0 -51 -36 -84q-29 -26 -89 -26h-17v220h17q61 0 89 -27q36 -31 36 -83zM2089 824q0 -52 -64 -52h-19v101h20q63 0 63 -49zM380 759q0 74 -50 120.5t-129 46.5h-95v-333h95q74 0 119 38q60 51 60 128zM410 593h65v333h-65v-333zM730 694q0 40 -20.5 62t-75.5 42 q-29 10 -39.5 19t-10.5 23q0 16 13.5 26.5t34.5 10.5q29 0 53 -27l34 44q-41 37 -98 37q-44 0 -74 -27.5t-30 -67.5q0 -35 18 -55.5t64 -36.5q37 -13 45 -19q19 -12 19 -34q0 -20 -14 -33.5t-36 -13.5q-48 0 -71 44l-42 -40q44 -64 115 -64q51 0 83 30.5t32 79.5zM1008 604 v77q-37 -37 -78 -37q-49 0 -80.5 32.5t-31.5 82.5q0 48 31.5 81.5t77.5 33.5q43 0 81 -38v77q-40 20 -80 20q-74 0 -125.5 -50.5t-51.5 -123.5t51 -123.5t125 -50.5q42 0 81 19zM2240 0v527q-65 -40 -144.5 -84t-237.5 -117t-329.5 -137.5t-417.5 -134.5t-504 -118h1569 q26 0 45 19t19 45zM1389 757q0 75 -53 128t-128 53t-128 -53t-53 -128t53 -128t128 -53t128 53t53 128zM1541 584l144 342h-71l-90 -224l-89 224h-71l142 -342h35zM1714 593h184v56h-119v90h115v56h-115v74h119v57h-184v-333zM2105 593h80l-105 140q76 16 76 94q0 47 -31 73 t-87 26h-97v-333h65v133h9zM2304 1274v-1268q0 -56 -38.5 -95t-93.5 -39h-2040q-55 0 -93.5 39t-38.5 95v1268q0 56 38.5 95t93.5 39h2040q55 0 93.5 -39t38.5 -95z" />
@@ -641,45 +641,15 @@
<glyph unicode="" d="M841 483l148 -148l-149 -149zM840 1094l149 -149l-148 -148zM710 -130l464 464l-306 306l306 306l-464 464v-611l-255 255l-93 -93l320 -321l-320 -321l93 -93l255 255v-611zM1429 640q0 -209 -32 -365.5t-87.5 -257t-140.5 -162.5t-181.5 -86.5t-219.5 -24.5 t-219.5 24.5t-181.5 86.5t-140.5 162.5t-87.5 257t-32 365.5t32 365.5t87.5 257t140.5 162.5t181.5 86.5t219.5 24.5t219.5 -24.5t181.5 -86.5t140.5 -162.5t87.5 -257t32 -365.5z" />
<glyph unicode="" horiz-adv-x="1024" d="M596 113l173 172l-173 172v-344zM596 823l173 172l-173 172v-344zM628 640l356 -356l-539 -540v711l-297 -296l-108 108l372 373l-372 373l108 108l297 -296v711l539 -540z" />
<glyph unicode="" d="M1280 256q0 52 -38 90t-90 38t-90 -38t-38 -90t38 -90t90 -38t90 38t38 90zM512 1024q0 52 -38 90t-90 38t-90 -38t-38 -90t38 -90t90 -38t90 38t38 90zM1536 256q0 -159 -112.5 -271.5t-271.5 -112.5t-271.5 112.5t-112.5 271.5t112.5 271.5t271.5 112.5t271.5 -112.5 t112.5 -271.5zM1440 1344q0 -20 -13 -38l-1056 -1408q-19 -26 -51 -26h-160q-26 0 -45 19t-19 45q0 20 13 38l1056 1408q19 26 51 26h160q26 0 45 -19t19 -45zM768 1024q0 -159 -112.5 -271.5t-271.5 -112.5t-271.5 112.5t-112.5 271.5t112.5 271.5t271.5 112.5 t271.5 -112.5t112.5 -271.5z" />
<glyph unicode="" horiz-adv-x="1792" d="M104 830l792 -1015l-868 630q-18 13 -25 34.5t0 42.5l101 308v0zM566 830h660l-330 -1015v0zM368 1442l198 -612h-462l198 612q8 23 33 23t33 -23zM1688 830l101 -308q7 -21 0 -42.5t-25 -34.5l-868 -630l792 1015v0zM1688 830h-462l198 612q8 23 33 23t33 -23z" />
<glyph unicode="" horiz-adv-x="1792" d="M384 704h160v224h-160v-224zM1221 372v92q-104 -36 -243 -38q-135 -1 -259.5 46.5t-220.5 122.5l1 -96q88 -80 212 -128.5t272 -47.5q129 0 238 49zM640 704h640v224h-640v-224zM1792 736q0 -187 -99 -352q89 -102 89 -229q0 -157 -129.5 -268t-313.5 -111 q-122 0 -225 52.5t-161 140.5q-19 -1 -57 -1t-57 1q-58 -88 -161 -140.5t-225 -52.5q-184 0 -313.5 111t-129.5 268q0 127 89 229q-99 165 -99 352q0 209 120 385.5t326.5 279.5t449.5 103t449.5 -103t326.5 -279.5t120 -385.5z" />
<glyph unicode="" d="M515 625v-128h-252v128h252zM515 880v-127h-252v127h252zM1273 369v-128h-341v128h341zM1273 625v-128h-672v128h672zM1273 880v-127h-672v127h672zM1408 20v1240q0 8 -6 14t-14 6h-32l-378 -256l-210 171l-210 -171l-378 256h-32q-8 0 -14 -6t-6 -14v-1240q0 -8 6 -14 t14 -6h1240q8 0 14 6t6 14zM553 1130l185 150h-406zM983 1130l221 150h-406zM1536 1260v-1240q0 -62 -43 -105t-105 -43h-1240q-62 0 -105 43t-43 105v1240q0 62 43 105t105 43h1240q62 0 105 -43t43 -105z" />
<glyph unicode="" horiz-adv-x="1792" d="M896 720q-104 196 -160 278q-139 202 -347 318q-34 19 -70 36q-89 40 -94 32t34 -38l39 -31q62 -43 112.5 -93.5t94.5 -116.5t70.5 -113t70.5 -131q9 -17 13 -25q44 -84 84 -153t98 -154t115.5 -150t131 -123.5t148.5 -90.5q153 -66 154 -60q1 3 -49 37q-53 36 -81 57 q-77 58 -179 211t-185 310zM549 177q-76 60 -132.5 125t-98 143.5t-71 154.5t-58.5 186t-52 209t-60.5 252t-76.5 289q273 0 497.5 -36t379 -92t271 -144.5t185.5 -172.5t110 -198.5t56 -199.5t12.5 -198.5t-9.5 -173t-20 -143.5t-13 -107l323 -327h-104l-281 285 q-22 -2 -91.5 -14t-121.5 -19t-138 -6t-160.5 17t-167.5 59t-179 111z" />
<glyph unicode="" horiz-adv-x="1792" d="M1374 879q-6 26 -28.5 39.5t-48.5 7.5q-261 -62 -401 -62t-401 62q-26 6 -48.5 -7.5t-28.5 -39.5t7.5 -48.5t39.5 -28.5q194 -46 303 -58q-2 -158 -15.5 -269t-26.5 -155.5t-41 -115.5l-9 -21q-10 -25 1 -49t36 -34q9 -4 23 -4q44 0 60 41l8 20q54 139 71 259h42 q17 -120 71 -259l8 -20q16 -41 60 -41q14 0 23 4q25 10 36 34t1 49l-9 21q-28 71 -41 115.5t-26.5 155.5t-15.5 269q109 12 303 58q26 6 39.5 28.5t7.5 48.5zM1024 1024q0 53 -37.5 90.5t-90.5 37.5t-90.5 -37.5t-37.5 -90.5t37.5 -90.5t90.5 -37.5t90.5 37.5t37.5 90.5z M1600 640q0 -143 -55.5 -273.5t-150 -225t-225 -150t-273.5 -55.5t-273.5 55.5t-225 150t-150 225t-55.5 273.5t55.5 273.5t150 225t225 150t273.5 55.5t273.5 -55.5t225 -150t150 -225t55.5 -273.5zM896 1408q-156 0 -298 -61t-245 -164t-164 -245t-61 -298t61 -298 t164 -245t245 -164t298 -61t298 61t245 164t164 245t61 298t-61 298t-164 245t-245 164t-298 61zM1792 640q0 -182 -71 -348t-191 -286t-286 -191t-348 -71t-348 71t-286 191t-191 286t-71 348t71 348t191 286t286 191t348 71t348 -71t286 -191t191 -286t71 -348z" />
<glyph unicode="" d="M1438 723q34 -35 29 -82l-44 -551q-4 -42 -34.5 -70t-71.5 -28q-6 0 -9 1q-44 3 -72.5 36.5t-25.5 77.5l35 429l-143 -8q55 -113 55 -240q0 -216 -148 -372l-137 137q91 101 91 235q0 145 -102.5 248t-247.5 103q-134 0 -236 -92l-137 138q120 114 284 141l264 300 l-149 87l-181 -161q-33 -30 -77 -27.5t-73 35.5t-26.5 77t34.5 73l239 213q26 23 60 26.5t64 -14.5l488 -283q36 -21 48 -68q17 -67 -26 -117l-205 -232l371 20q49 3 83 -32zM1240 1180q-74 0 -126 52t-52 126t52 126t126 52t126.5 -52t52.5 -126t-52.5 -126t-126.5 -52z M613 -62q106 0 196 61l139 -139q-146 -116 -335 -116q-148 0 -273.5 73t-198.5 198t-73 273q0 188 116 336l139 -139q-60 -88 -60 -197q0 -145 102.5 -247.5t247.5 -102.5z" />
<glyph unicode="" d="M880 336v-160q0 -14 -9 -23t-23 -9h-160q-14 0 -23 9t-9 23v160q0 14 9 23t23 9h160q14 0 23 -9t9 -23zM1136 832q0 -50 -15 -90t-45.5 -69t-52 -44t-59.5 -36q-32 -18 -46.5 -28t-26 -24t-11.5 -29v-32q0 -14 -9 -23t-23 -9h-160q-14 0 -23 9t-9 23v68q0 35 10.5 64.5 t24 47.5t39 35.5t41 25.5t44.5 21q53 25 75 43t22 49q0 42 -43.5 71.5t-95.5 29.5q-56 0 -95 -27q-29 -20 -80 -83q-9 -12 -25 -12q-11 0 -19 6l-108 82q-10 7 -12 20t5 23q122 192 349 192q129 0 238.5 -89.5t109.5 -214.5zM768 1280q-130 0 -248.5 -51t-204 -136.5 t-136.5 -204t-51 -248.5t51 -248.5t136.5 -204t204 -136.5t248.5 -51t248.5 51t204 136.5t136.5 204t51 248.5t-51 248.5t-136.5 204t-204 136.5t-248.5 51zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5 t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5t103 -385.5z" />
<glyph unicode="" horiz-adv-x="1408" d="M366 1225q-64 0 -110 45.5t-46 110.5q0 64 46 109.5t110 45.5t109.5 -45.5t45.5 -109.5q0 -65 -45.5 -110.5t-109.5 -45.5zM917 583q0 -50 -30 -67.5t-63.5 -6.5t-47.5 34l-367 438q-7 12 -14 15.5t-11 1.5l-3 -3q-7 -8 4 -21l122 -139l1 -354l-161 -457 q-67 -192 -92 -234q-16 -26 -28 -32q-50 -26 -103 -1q-29 13 -41.5 43t-9.5 57q2 17 197 618l5 416l-85 -164l35 -222q4 -24 -1 -42t-14 -27.5t-19 -16t-17 -7.5l-7 -2q-19 -3 -34.5 3t-24 16t-14 22t-7.5 19.5t-2 9.5l-46 299l211 381q23 34 113 34q75 0 107 -40l424 -521 q7 -5 14 -17l3 -3l-1 -1q7 -13 7 -29zM514 433q43 -113 88.5 -225t69.5 -168l24 -55q36 -93 42 -125q11 -70 -36 -97q-35 -22 -66 -16t-51 22t-29 35h-1q-6 16 -8 25l-124 351zM1338 -159q31 -49 31 -57q0 -5 -3 -7q-9 -5 -14.5 0.5t-15.5 26t-16 30.5q-114 172 -423 661 q3 -1 7 1t7 4l3 2q11 9 11 17z" />
<glyph unicode="" horiz-adv-x="2304" d="M504 542h171l-1 265zM1530 641q0 87 -50.5 140t-146.5 53h-54v-388h52q91 0 145 57t54 138zM956 1018l1 -756q0 -14 -9.5 -24t-23.5 -10h-216q-14 0 -23.5 10t-9.5 24v62h-291l-55 -81q-10 -15 -28 -15h-267q-21 0 -30.5 18t3.5 35l556 757q9 14 27 14h332q14 0 24 -10 t10 -24zM1783 641q0 -193 -125.5 -303t-324.5 -110h-270q-14 0 -24 10t-10 24v756q0 14 10 24t24 10h268q200 0 326 -109t126 -302zM1939 640q0 -11 -0.5 -29t-8 -71.5t-21.5 -102t-44.5 -108t-73.5 -102.5h-51q38 45 66.5 104.5t41.5 112t21 98t9 72.5l1 27q0 8 -0.5 22.5 t-7.5 60t-20 91.5t-41 111.5t-66 124.5h43q41 -47 72 -107t45.5 -111.5t23 -96t10.5 -70.5zM2123 640q0 -11 -0.5 -29t-8 -71.5t-21.5 -102t-45 -108t-74 -102.5h-51q38 45 66.5 104.5t41.5 112t21 98t9 72.5l1 27q0 8 -0.5 22.5t-7.5 60t-19.5 91.5t-40.5 111.5t-66 124.5 h43q41 -47 72 -107t45.5 -111.5t23 -96t10.5 -70.5zM2304 640q0 -11 -0.5 -29t-8 -71.5t-21.5 -102t-44.5 -108t-73.5 -102.5h-51q38 45 66 104.5t41 112t21 98t9 72.5l1 27q0 8 -0.5 22.5t-7.5 60t-19.5 91.5t-40.5 111.5t-66 124.5h43q41 -47 72 -107t45.5 -111.5t23 -96 t9.5 -70.5z" />
<glyph unicode="" horiz-adv-x="1408" d="M617 -153q0 11 -13 58t-31 107t-20 69q-1 4 -5 26.5t-8.5 36t-13.5 21.5q-15 14 -51 14q-23 0 -70 -5.5t-71 -5.5q-34 0 -47 11q-6 5 -11 15.5t-7.5 20t-6.5 24t-5 18.5q-37 128 -37 255t37 255q1 4 5 18.5t6.5 24t7.5 20t11 15.5q13 11 47 11q24 0 71 -5.5t70 -5.5 q36 0 51 14q9 8 13.5 21.5t8.5 36t5 26.5q2 9 20 69t31 107t13 58q0 22 -43.5 52.5t-75.5 42.5q-20 8 -45 8q-34 0 -98 -18q-57 -17 -96.5 -40.5t-71 -66t-46 -70t-45.5 -94.5q-6 -12 -9 -19q-49 -107 -68 -216t-19 -244t19 -244t68 -216q56 -122 83 -161q63 -91 179 -127 l6 -2q64 -18 98 -18q25 0 45 8q32 12 75.5 42.5t43.5 52.5zM776 760q-26 0 -45 19t-19 45.5t19 45.5q37 37 37 90q0 52 -37 91q-19 19 -19 45t19 45t45 19t45 -19q75 -75 75 -181t-75 -181q-21 -19 -45 -19zM957 579q-27 0 -45 19q-19 19 -19 45t19 45q112 114 112 272 t-112 272q-19 19 -19 45t19 45t45 19t45 -19q150 -150 150 -362t-150 -362q-18 -19 -45 -19zM1138 398q-27 0 -45 19q-19 19 -19 45t19 45q90 91 138.5 208t48.5 245t-48.5 245t-138.5 208q-19 19 -19 45t19 45t45 19t45 -19q109 -109 167 -249t58 -294t-58 -294t-167 -249 q-18 -19 -45 -19z" />
<glyph unicode="" horiz-adv-x="2176" d="M192 352q-66 0 -113 -47t-47 -113t47 -113t113 -47t113 47t47 113t-47 113t-113 47zM704 352q-66 0 -113 -47t-47 -113t47 -113t113 -47t113 47t47 113t-47 113t-113 47zM704 864q-66 0 -113 -47t-47 -113t47 -113t113 -47t113 47t47 113t-47 113t-113 47zM1472 352 q-66 0 -113 -47t-47 -113t47 -113t113 -47t113 47t47 113t-47 113t-113 47zM1984 352q-66 0 -113 -47t-47 -113t47 -113t113 -47t113 47t47 113t-47 113t-113 47zM1472 864q-66 0 -113 -47t-47 -113t47 -113t113 -47t113 47t47 113t-47 113t-113 47zM1984 864 q-66 0 -113 -47t-47 -113t47 -113t113 -47t113 47t47 113t-47 113t-113 47zM1984 1376q-66 0 -113 -47t-47 -113t47 -113t113 -47t113 47t47 113t-47 113t-113 47zM384 192q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM896 192q0 -80 -56 -136 t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM384 704q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM896 704q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM384 1216q0 -80 -56 -136t-136 -56 t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM1664 192q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM896 1216q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM2176 192q0 -80 -56 -136t-136 -56t-136 56 t-56 136t56 136t136 56t136 -56t56 -136zM1664 704q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM2176 704q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136zM1664 1216q0 -80 -56 -136t-136 -56t-136 56t-56 136 t56 136t136 56t136 -56t56 -136zM2176 1216q0 -80 -56 -136t-136 -56t-136 56t-56 136t56 136t136 56t136 -56t56 -136z" />
<glyph unicode="" horiz-adv-x="1792" d="M128 -192q0 -26 -19 -45t-45 -19t-45 19t-19 45t19 45t45 19t45 -19t19 -45zM320 0q0 -26 -19 -45t-45 -19t-45 19t-19 45t19 45t45 19t45 -19t19 -45zM365 365l256 -256l-90 -90l-256 256zM704 384q0 -26 -19 -45t-45 -19t-45 19t-19 45t19 45t45 19t45 -19t19 -45z M1411 704q0 -59 -11.5 -108.5t-37.5 -93.5t-44 -67.5t-53 -64.5q-31 -35 -45.5 -54t-33.5 -50t-26.5 -64t-7.5 -74q0 -159 -112.5 -271.5t-271.5 -112.5q-26 0 -45 19t-19 45t19 45t45 19q106 0 181 75t75 181q0 57 11.5 105.5t37 91t43.5 66.5t52 63q40 46 59.5 72 t37.5 74.5t18 103.5q0 185 -131.5 316.5t-316.5 131.5t-316.5 -131.5t-131.5 -316.5q0 -26 -19 -45t-45 -19t-45 19t-19 45q0 117 45.5 223.5t123 184t184 123t223.5 45.5t223.5 -45.5t184 -123t123 -184t45.5 -223.5zM896 576q0 -26 -19 -45t-45 -19t-45 19t-19 45t19 45 t45 19t45 -19t19 -45zM1184 704q0 -26 -19 -45t-45 -19t-45 19t-19 45q0 93 -65.5 158.5t-158.5 65.5q-92 0 -158 -65.5t-66 -158.5q0 -26 -19 -45t-45 -19t-45 19t-19 45q0 146 103 249t249 103t249 -103t103 -249zM1578 993q10 -25 -1 -49t-36 -34q-9 -4 -23 -4 q-19 0 -35.5 11t-23.5 30q-68 178 -224 295q-21 16 -25 42t12 47q17 21 43 25t47 -12q183 -137 266 -351zM1788 1074q9 -25 -1.5 -49t-35.5 -34q-11 -4 -23 -4q-44 0 -60 41q-92 238 -297 393q-22 16 -25.5 42t12.5 47q16 22 42 25.5t47 -12.5q235 -175 341 -449z" />
<glyph unicode="" horiz-adv-x="2304" d="M1032 576q-59 2 -84 55q-17 34 -48 53.5t-68 19.5q-53 0 -90.5 -37.5t-37.5 -90.5q0 -56 36 -89l10 -8q34 -31 82 -31q37 0 68 19.5t48 53.5q25 53 84 55zM1600 704q0 56 -36 89l-10 8q-34 31 -82 31q-37 0 -68 -19.5t-48 -53.5q-25 -53 -84 -55q59 -2 84 -55 q17 -34 48 -53.5t68 -19.5q53 0 90.5 37.5t37.5 90.5zM1174 925q-17 -35 -55 -48t-73 4q-62 31 -134 31q-51 0 -99 -17q3 0 9.5 0.5t9.5 0.5q92 0 170.5 -50t118.5 -133q17 -36 3.5 -73.5t-49.5 -54.5q-18 -9 -39 -9q21 0 39 -9q36 -17 49.5 -54.5t-3.5 -73.5 q-40 -83 -118.5 -133t-170.5 -50h-6q-16 2 -44 4l-290 27l-239 -120q-14 -7 -29 -7q-40 0 -57 35l-160 320q-11 23 -4 47.5t29 37.5l209 119l148 267q17 155 91.5 291.5t195.5 236.5q31 25 70.5 21.5t64.5 -34.5t21.5 -70t-34.5 -65q-70 -59 -117 -128q123 84 267 101 q40 5 71.5 -19t35.5 -64q5 -40 -19 -71.5t-64 -35.5q-84 -10 -159 -55q46 10 99 10q115 0 218 -50q36 -18 49 -55.5t-5 -73.5zM2137 1085l160 -320q11 -23 4 -47.5t-29 -37.5l-209 -119l-148 -267q-17 -155 -91.5 -291.5t-195.5 -236.5q-26 -22 -61 -22q-45 0 -74 35 q-25 31 -21.5 70t34.5 65q70 59 117 128q-123 -84 -267 -101q-4 -1 -12 -1q-36 0 -63.5 24t-31.5 60q-5 40 19 71.5t64 35.5q84 10 159 55q-46 -10 -99 -10q-115 0 -218 50q-36 18 -49 55.5t5 73.5q17 35 55 48t73 -4q62 -31 134 -31q51 0 99 17q-3 0 -9.5 -0.5t-9.5 -0.5 q-92 0 -170.5 50t-118.5 133q-17 36 -3.5 73.5t49.5 54.5q18 9 39 9q-21 0 -39 9q-36 17 -49.5 54.5t3.5 73.5q40 83 118.5 133t170.5 50h6h1q14 -2 42 -4l291 -27l239 120q14 7 29 7q40 0 57 -35z" />
<glyph unicode="" horiz-adv-x="1792" d="M1056 704q0 -26 19 -45t45 -19t45 19t19 45q0 146 -103 249t-249 103t-249 -103t-103 -249q0 -26 19 -45t45 -19t45 19t19 45q0 93 66 158.5t158 65.5t158 -65.5t66 -158.5zM835 1280q-117 0 -223.5 -45.5t-184 -123t-123 -184t-45.5 -223.5q0 -26 19 -45t45 -19t45 19 t19 45q0 185 131.5 316.5t316.5 131.5t316.5 -131.5t131.5 -316.5q0 -55 -18 -103.5t-37.5 -74.5t-59.5 -72q-34 -39 -52 -63t-43.5 -66.5t-37 -91t-11.5 -105.5q0 -106 -75 -181t-181 -75q-26 0 -45 -19t-19 -45t19 -45t45 -19q159 0 271.5 112.5t112.5 271.5q0 41 7.5 74 t26.5 64t33.5 50t45.5 54q35 41 53 64.5t44 67.5t37.5 93.5t11.5 108.5q0 117 -45.5 223.5t-123 184t-184 123t-223.5 45.5zM591 561l226 -226l-579 -579q-12 -12 -29 -12t-29 12l-168 168q-12 12 -12 29t12 29zM1612 1524l168 -168q12 -12 12 -29t-12 -30l-233 -233 l-26 -25l-71 -71q-66 153 -195 258l91 91l207 207q13 12 30 12t29 -12z" />
<glyph unicode="" d="M866 1021q0 -27 -13 -94q-11 -50 -31.5 -150t-30.5 -150q-2 -11 -4.5 -12.5t-13.5 -2.5q-20 -2 -31 -2q-58 0 -84 49.5t-26 113.5q0 88 35 174t103 124q28 14 51 14q28 0 36.5 -16.5t8.5 -47.5zM1352 597q0 14 -39 75.5t-52 66.5q-21 8 -34 8q-91 0 -226 -77l-2 2 q3 22 27.5 135t24.5 178q0 233 -242 233q-24 0 -68 -6q-94 -17 -168.5 -89.5t-111.5 -166.5t-37 -189q0 -146 80.5 -225t227.5 -79q25 0 25 -3t-1 -5q-4 -34 -26 -117q-14 -52 -51.5 -101t-82.5 -49q-42 0 -42 47q0 24 10.5 47.5t25 39.5t29.5 28.5t26 20t11 8.5q0 3 -7 10 q-24 22 -58.5 36.5t-65.5 14.5q-35 0 -63.5 -34t-41 -75t-12.5 -75q0 -88 51.5 -142t138.5 -54q82 0 155 53t117.5 126t65.5 153q6 22 15.5 66.5t14.5 66.5q3 12 14 18q118 60 227 60q48 0 127 -18q1 -1 4 -1q5 0 9.5 4.5t4.5 8.5zM1536 1120v-960q0 -119 -84.5 -203.5 t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5t84.5 -203.5z" />
<glyph unicode="" horiz-adv-x="1535" d="M744 1231q0 24 -2 38.5t-8.5 30t-21 23t-37.5 7.5q-39 0 -78 -23q-105 -58 -159 -190.5t-54 -269.5q0 -44 8.5 -85.5t26.5 -80.5t52.5 -62.5t81.5 -23.5q4 0 18 -0.5t20 0t16 3t15 8.5t7 16q16 77 48 231.5t48 231.5q19 91 19 146zM1498 575q0 -7 -7.5 -13.5t-15.5 -6.5 l-6 1q-22 3 -62 11t-72 12.5t-63 4.5q-167 0 -351 -93q-15 -8 -21 -27q-10 -36 -24.5 -105.5t-22.5 -100.5q-23 -91 -70 -179.5t-112.5 -164.5t-154.5 -123t-185 -47q-135 0 -214.5 83.5t-79.5 219.5q0 53 19.5 117t63 116.5t97.5 52.5q38 0 120 -33.5t83 -61.5 q0 -1 -16.5 -12.5t-39.5 -31t-46 -44.5t-39 -61t-16 -74q0 -33 16.5 -53t48.5 -20q45 0 85 31.5t66.5 78t48 105.5t32.5 107t16 90v9q0 2 -3.5 3.5t-8.5 1.5h-10t-10 -0.5t-6 -0.5q-227 0 -352 122.5t-125 348.5q0 108 34.5 221t96 210t156 167.5t204.5 89.5q52 9 106 9 q374 0 374 -360q0 -98 -38 -273t-43 -211l3 -3q101 57 182.5 88t167.5 31q22 0 53 -13q19 -7 80 -102.5t61 -116.5z" />
<glyph unicode="" horiz-adv-x="1664" d="M831 863q32 0 59 -18l222 -148q61 -40 110 -97l146 -170q40 -46 29 -106l-72 -413q-6 -32 -29.5 -53.5t-55.5 -25.5l-527 -56l-352 -32h-9q-39 0 -67.5 28t-28.5 68q0 37 27 64t65 32l260 32h-448q-41 0 -69.5 30t-26.5 71q2 39 32 65t69 26l442 1l-521 64q-41 5 -66 37 t-19 73q6 35 34.5 57.5t65.5 22.5h10l481 -60l-351 94q-38 10 -62 41.5t-18 68.5q6 36 33 58.5t62 22.5q6 0 20 -2l448 -96l217 -37q1 0 3 -0.5t3 -0.5q23 0 30.5 23t-12.5 36l-186 125q-35 23 -42 63.5t18 73.5q27 38 76 38zM761 661l186 -125l-218 37l-5 2l-36 38 l-238 262q-1 1 -2.5 3.5t-2.5 3.5q-24 31 -18.5 70t37.5 64q31 23 68 17.5t64 -33.5l142 -147l-4 -4t-5 -4q-32 -45 -23 -99t55 -85zM1648 1115l15 -266q4 -73 -11 -147l-48 -219q-12 -59 -67 -87l-106 -54q2 62 -39 109l-146 170q-53 61 -117 103l-222 148q-34 23 -76 23 q-51 0 -88 -37l-235 312q-25 33 -18 73.5t41 63.5q33 22 71.5 14t62.5 -40l266 -352l-262 455q-21 35 -10.5 75t47.5 59q35 18 72.5 6t57.5 -46l241 -420l-136 337q-15 35 -4.5 74t44.5 56q37 19 76 6t56 -51l193 -415l101 -196q8 -15 23 -17.5t27 7.5t11 26l-12 224 q-2 41 26 71t69 31q39 0 67 -28.5t30 -67.5z" />
<glyph unicode="" horiz-adv-x="1792" d="M335 180q-2 0 -6 2q-86 57 -168.5 145t-139.5 180q-21 30 -21 69q0 9 2 19t4 18t7 18t8.5 16t10.5 17t10 15t12 15.5t11 14.5q184 251 452 365q-110 198 -110 211q0 19 17 29q116 64 128 64q18 0 28 -16l124 -229q92 19 192 19q266 0 497.5 -137.5t378.5 -369.5 q20 -31 20 -69t-20 -69q-91 -142 -218.5 -253.5t-278.5 -175.5q110 -198 110 -211q0 -20 -17 -29q-116 -64 -127 -64q-19 0 -29 16l-124 229l-64 119l-444 820l7 7q-58 -24 -99 -47q3 -5 127 -234t243 -449t119 -223q0 -7 -9 -9q-13 -3 -72 -3q-57 0 -60 7l-456 841 q-39 -28 -82 -68q24 -43 214 -393.5t190 -354.5q0 -10 -11 -10q-14 0 -82.5 22t-72.5 28l-106 197l-224 413q-44 -53 -78 -106q2 -3 18 -25t23 -34l176 -327q0 -10 -10 -10zM1165 282l49 -91q273 111 450 385q-180 277 -459 389q67 -64 103 -148.5t36 -176.5 q0 -106 -47 -200.5t-132 -157.5zM848 896q0 -20 14 -34t34 -14q86 0 147 -61t61 -147q0 -20 14 -34t34 -14t34 14t14 34q0 126 -89 215t-215 89q-20 0 -34 -14t-14 -34zM1214 961l-9 4l7 -7z" />
<glyph unicode="" horiz-adv-x="1280" d="M1050 430q0 -215 -147 -374q-148 -161 -378 -161q-232 0 -378 161q-147 159 -147 374q0 147 68 270.5t189 196.5t268 73q96 0 182 -31q-32 -62 -39 -126q-66 28 -143 28q-167 0 -280.5 -123t-113.5 -291q0 -170 112.5 -288.5t281.5 -118.5t281 118.5t112 288.5 q0 89 -32 166q66 13 123 49q41 -98 41 -212zM846 619q0 -192 -79.5 -345t-238.5 -253l-14 -1q-29 0 -62 5q83 32 146.5 102.5t99.5 154.5t58.5 189t30 192.5t7.5 178.5q0 69 -3 103q55 -160 55 -326zM791 947v-2q-73 214 -206 440q88 -59 142.5 -186.5t63.5 -251.5z M1035 744q-83 0 -160 75q218 120 290 247q19 37 21 56q-42 -94 -139.5 -166.5t-204.5 -97.5q-35 54 -35 113q0 37 17 79t43 68q46 44 157 74q59 16 106 58.5t74 100.5q74 -105 74 -253q0 -109 -24 -170q-32 -77 -88.5 -130.5t-130.5 -53.5z" />
<glyph unicode="" d="M1050 495q0 78 -28 147q-41 -25 -85 -34q22 -50 22 -114q0 -117 -77 -198.5t-193 -81.5t-193.5 81.5t-77.5 198.5q0 115 78 199.5t193 84.5q53 0 98 -19q4 43 27 87q-60 21 -125 21q-154 0 -257.5 -108.5t-103.5 -263.5t103.5 -261t257.5 -106t257.5 106.5t103.5 260.5z M872 850q2 -24 2 -71q0 -63 -5 -123t-20.5 -132.5t-40.5 -130t-68.5 -106t-100.5 -70.5q21 -3 42 -3h10q219 139 219 411q0 116 -38 225zM872 850q-4 80 -44 171.5t-98 130.5q92 -156 142 -302zM1207 955q0 102 -51 174q-41 -86 -124 -109q-69 -19 -109 -53.5t-40 -99.5 q0 -40 24 -77q74 17 140.5 67t95.5 115q-4 -52 -74.5 -111.5t-138.5 -97.5q52 -52 110 -52q51 0 90 37t60 90q17 43 17 117zM1536 1120v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960q119 0 203.5 -84.5 t84.5 -203.5z" />
<glyph unicode="" d="M1279 388q0 22 -22 27q-67 15 -118 59t-80 108q-7 19 -7 25q0 15 19.5 26t43 17t43 20.5t19.5 36.5q0 19 -18.5 31.5t-38.5 12.5q-12 0 -32 -8t-31 -8q-4 0 -12 2q5 95 5 114q0 79 -17 114q-36 78 -103 121.5t-152 43.5q-199 0 -275 -165q-17 -35 -17 -114q0 -19 5 -114 q-4 -2 -14 -2q-12 0 -32 7.5t-30 7.5q-21 0 -38.5 -12t-17.5 -32q0 -21 19.5 -35.5t43 -20.5t43 -17t19.5 -26q0 -6 -7 -25q-64 -138 -198 -167q-22 -5 -22 -27q0 -46 137 -68q2 -5 6 -26t11.5 -30.5t23.5 -9.5q12 0 37.5 4.5t39.5 4.5q35 0 67 -15t54 -32.5t57.5 -32.5 t76.5 -15q43 0 79 15t57.5 32.5t53.5 32.5t67 15q14 0 39.5 -4t38.5 -4q16 0 23 10t11 30t6 25q137 22 137 68zM1536 640q0 -209 -103 -385.5t-279.5 -279.5t-385.5 -103t-385.5 103t-279.5 279.5t-103 385.5t103 385.5t279.5 279.5t385.5 103t385.5 -103t279.5 -279.5 t103 -385.5z" />
<glyph unicode="" horiz-adv-x="1664" d="M848 1408q134 1 240.5 -68.5t163.5 -192.5q27 -58 27 -179q0 -47 -9 -191q14 -7 28 -7q18 0 51 13.5t51 13.5q29 0 56 -18t27 -46q0 -32 -31.5 -54t-69 -31.5t-69 -29t-31.5 -47.5q0 -15 12 -43q37 -82 102.5 -150t144.5 -101q28 -12 80 -23q28 -6 28 -35 q0 -70 -219 -103q-7 -11 -11 -39t-14 -46.5t-33 -18.5q-20 0 -62 6.5t-64 6.5q-37 0 -62 -5q-32 -5 -63 -22.5t-58 -38t-58 -40.5t-76 -33.5t-99 -13.5q-52 0 -96.5 13.5t-75 33.5t-57.5 40.5t-58 38t-62 22.5q-26 5 -63 5q-24 0 -65.5 -7.5t-58.5 -7.5q-25 0 -35 18.5 t-14 47.5t-11 40q-219 33 -219 103q0 29 28 35q52 11 80 23q78 32 144.5 101t102.5 150q12 28 12 43q0 28 -31.5 47.5t-69.5 29.5t-69.5 31.5t-31.5 52.5q0 27 26 45.5t55 18.5q15 0 48 -13t53 -13q18 0 32 7q-9 142 -9 190q0 122 27 180q64 137 172 198t264 63z" />
<glyph unicode="" d="M1280 388q0 22 -22 27q-67 14 -118 58t-80 109q-7 14 -7 25q0 15 19.5 26t42.5 17t42.5 20.5t19.5 36.5q0 19 -18.5 31.5t-38.5 12.5q-11 0 -31 -8t-32 -8q-4 0 -12 2q5 63 5 115q0 78 -17 114q-36 78 -102.5 121.5t-152.5 43.5q-198 0 -275 -165q-18 -38 -18 -115 q0 -38 6 -114q-10 -2 -15 -2q-11 0 -31.5 8t-30.5 8q-20 0 -37.5 -12.5t-17.5 -32.5q0 -21 19.5 -35.5t42.5 -20.5t42.5 -17t19.5 -26q0 -11 -7 -25q-64 -138 -198 -167q-22 -5 -22 -27q0 -47 138 -69q2 -5 6 -26t11 -30.5t23 -9.5q13 0 38.5 5t38.5 5q35 0 67.5 -15 t54.5 -32.5t57.5 -32.5t76.5 -15q43 0 79 15t57.5 32.5t54 32.5t67.5 15q13 0 39 -4.5t39 -4.5q15 0 22.5 9.5t11.5 31t5 24.5q138 22 138 69zM1536 1120v-960q0 -119 -84.5 -203.5t-203.5 -84.5h-960q-119 0 -203.5 84.5t-84.5 203.5v960q0 119 84.5 203.5t203.5 84.5h960 q119 0 203.5 -84.5t84.5 -203.5z" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
<glyph unicode="" horiz-adv-x="1792" />
</font>
</defs></svg>
Before Width: | Height: | Size: 377 KiB After Width: | Height: | Size: 357 KiB |
BIN
web_db/flask/app/static/fonts/fontawesome-webfont.woff
Normal file
BIN
web_db/flask/app/static/fonts/fontawesome-webfont.woff2
Normal file
Before Width: | Height: | Size: 3.4 KiB After Width: | Height: | Size: 3.4 KiB |
Before Width: | Height: | Size: 16 KiB After Width: | Height: | Size: 16 KiB |
Before Width: | Height: | Size: 24 KiB After Width: | Height: | Size: 24 KiB |
Before Width: | Height: | Size: 20 KiB After Width: | Height: | Size: 20 KiB |
Before Width: | Height: | Size: 21 KiB After Width: | Height: | Size: 21 KiB |
Before Width: | Height: | Size: 36 KiB After Width: | Height: | Size: 36 KiB |
Before Width: | Height: | Size: 15 KiB After Width: | Height: | Size: 15 KiB |
Before Width: | Height: | Size: 6.2 KiB After Width: | Height: | Size: 6.2 KiB |
Before Width: | Height: | Size: 7.8 KiB After Width: | Height: | Size: 7.8 KiB |
Before Width: | Height: | Size: 32 KiB After Width: | Height: | Size: 32 KiB |
Before Width: | Height: | Size: 16 KiB After Width: | Height: | Size: 16 KiB |
Before Width: | Height: | Size: 14 KiB After Width: | Height: | Size: 14 KiB |
Before Width: | Height: | Size: 20 KiB After Width: | Height: | Size: 20 KiB |
Before Width: | Height: | Size: 15 KiB After Width: | Height: | Size: 15 KiB |
Before Width: | Height: | Size: 21 KiB After Width: | Height: | Size: 21 KiB |
Before Width: | Height: | Size: 24 KiB After Width: | Height: | Size: 24 KiB |
Before Width: | Height: | Size: 15 KiB After Width: | Height: | Size: 15 KiB |
Before Width: | Height: | Size: 17 KiB After Width: | Height: | Size: 17 KiB |
Before Width: | Height: | Size: 21 KiB After Width: | Height: | Size: 21 KiB |
Before Width: | Height: | Size: 17 KiB After Width: | Height: | Size: 17 KiB |
Before Width: | Height: | Size: 4.3 KiB After Width: | Height: | Size: 4.3 KiB |
Before Width: | Height: | Size: 5.7 KiB After Width: | Height: | Size: 5.7 KiB |
Before Width: | Height: | Size: 9.6 KiB After Width: | Height: | Size: 9.6 KiB |
Before Width: | Height: | Size: 22 KiB After Width: | Height: | Size: 22 KiB |
Before Width: | Height: | Size: 20 KiB After Width: | Height: | Size: 20 KiB |