v0.2.0.5 (2024/07/31)

This is a fairly large change set: the shared code has been consolidated into the new commons package, and every custom module now uses the logging module to record important information.
1. [t_change_ui: clibs.py]
   - Reorganized the code structure: added this new module and merged the shared functions and classes into it
   - Moved a number of constants into this module
   - Introduced the logging/concurrent_log_handler modules and initialized them for the other modules to use; the log rotates at 50 MB and keeps at most 10 backups (see the sketch after this list)
   - Rewrote the project-name handling in prj_to_xcore, fixing the case where a project containing multiple .prj files could fail to run
2. [t_change_ui: openapi.py]
   - Completely rewrote the get_from_id function to make it more precise
   - Added a logger to msg_storage so that all response messages are kept
   - Removed the log-saving part of the heartbeat function
   - Changed the heartbeat interval back to 2 s
3. [t_change_ui: aio.py]
   - Added the logging initialization part
   - In detect_network, changed the delay before re-instantiating HR to 4 s, matching the heartbeat (the file-based handshake is sketched after the aio.py diff below)
4. [t_change_ui: do_brake.py]
   - Worked around the OOM problem by keeping the diagnosis curve open for the whole run, and changed the data processing to keep only the last 12 s
5. [t_change_ui: do_current.py]
   - Holding current: keep only the last 15 s
6. [t_change_ui: all the parts]: imported the commons package and customized the logging output; further optimization will follow
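
The logging initialization mentioned in item 1 lives in commons/clibs.py and is shared by every module. A minimal sketch of that setup, condensed from the clibs.py listing further down in this diff (the real module also derives the paths from PREFIX and defines more constants):
```
from logging import Formatter, StreamHandler, basicConfig, getLogger, INFO, ERROR, WARNING
from concurrent_log_handler import ConcurrentRotatingFileHandler

log_data = '../assets/templates/c_msg.log'   # clibs.log_data, built from PREFIX

fmt = Formatter('%(asctime)s # %(levelname)s-%(filename)s-%(funcName)s # %(message)s')
# rotate the shared log at 50 MB, keep at most 10 backups
file_handler = ConcurrentRotatingFileHandler(filename=log_data, backupCount=10, maxBytes=50*1024*1024, encoding='utf-8')
file_handler.setFormatter(fmt)
file_handler.setLevel(INFO)
console_handler = StreamHandler()
console_handler.setFormatter(fmt)
console_handler.setLevel(ERROR)
basicConfig(level=WARNING, datefmt='%Y-%m-%dT%H:%M:%S', handlers=[file_handler, console_handler])

# each custom module then only needs:
logger = getLogger(__file__)
logger.info("logging initialized")
```
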
commit 04bd1238d2 (parent af68f19d53)
gitea, 2024-07-31 08:05:36 +08:00
20 changed files with 424 additions and 586 deletions

.gitignore vendored
View File

@ -7,6 +7,8 @@ aio/venv
aio/__pycache__/ aio/__pycache__/
aio/code/automatic_test/__pycache__/ aio/code/automatic_test/__pycache__/
aio/code/data_process/__pycache__/ aio/code/data_process/__pycache__/
aio/assets/templates/c_msg.log aio/assets/templates/c_msg.log*
aio/code/durable_action/__pycache__/ aio/code/durable_action/__pycache__/
aio/assets/templates/durable/ aio/assets/templates/durable/
aio/assets/templates/.__c_msg.lock
aio/code/commons/__pycache__/

View File

@ -31,9 +31,10 @@
### 四、打包方法 ### 四、打包方法
打包时,只需要修改 clibs.py 中的 PREFIX 即可,调试时再修改回来
``` ```
pyinstaller.exe -F --version-file file_version_info.txt -i .\icon.ico .\aio.py -p .\brake.py -p .\current.py pyinstaller --noconfirm --onedir --windowed --optimize 2 --contents-directory . --upx-dir "D:/Syncthing/common/A_Program/upx-4.2.4-win64/" --add-data "C:/Users/Administrator/AppData/Local/Programs/Python/Python312/Lib/site-packages/customtkinter;customtkinter/" --add-data "D:/Syncthing/company/D-测试工作/X-自动化测试/01-AIO/rokae/aio/assets/templates:templates" --version-file ../assets/file_version_info.txt -i ../assets/icon.ico ../code/aio.py -p ../code/data_process/brake.py -p ../code/data_process/iso.py -p ../code/data_process/current.py -p ../code/data_process/wavelogger.py -p ../code/commons/openapi.py -p ../code/commons/clibs.py -p ../code/automatic_test/btn_functions.py -p ../code/automatic_test/do_current.py -p ../code/automatic_test/do_brake.py -p ../code/durable_action/factory_test.py
pyinstaller --noconfirm --onedir --windowed --add-data "C:/Users/Administrator/AppData/Local/Programs/Python/Python312/Lib/site-packages/customtkinter;customtkinter/" --version-file ..\assets\file_version_info.txt -i ..\assets\icon.ico ..\code\aio.py -p ..\code\data_process\brake.py -p ..\code\data_process\iso.py -p ..\code\data_process\current.py -p ..\code\data_process\wavelogger.py
``` ```
--- ---
@ -557,5 +558,28 @@ v0.2.0.3(2024/07/27)
2. [APIs: do_current.py]: 精简程序,解决 OOM 问题 2. [APIs: do_current.py]: 精简程序,解决 OOM 问题
3. [APIs: factory_test.py]: 精简程序,解决 OOM 问题 3. [APIs: factory_test.py]: 精简程序,解决 OOM 问题
4. [APIsL openapi.py] 4. [APIsL openapi.py]
- 心跳修改为 1 s因为 OOM 问题的解决依赖于长久的打开曲线开关,此时对于 hr.c_msg 的定时清理是个挑战,将心跳缩短,有利于清理日志后,避免丢失心跳 - 心跳修改为 1s因为 OOM 问题的解决依赖于长久的打开曲线开关,此时对于 hr.c_msg 的定时清理是个挑战,将心跳缩短,有利于清理日志后,避免丢失心跳
- 新增 diagnosis.save 命令,但是执行时,有问题,待解决 - 新增 diagnosis.save 命令,但是执行时,有问题,待解决
v0.2.0.5(2024/07/31)
此版本改动较大,公共部分做了规整,放置到新建文件夹 commons 当中,并所有自定义模块引入 logging 模块,记录重要信息
1. [t_change_ui: clibs.py]
- 调整代码组织结构,新增模块,将公共函数以及类合并入此
- 将一些常量放入该模块
- 引入logging/concurrent_log_handler模块并作初始化操作供其他模块使用按50M切割最多保留10份
- prj_to_xcore函数设置工程名部分重写修复了多个prj工程可能不能执行的问题
2. [t_change_ui: openapi.py]
- 完全重写了 get_from_id 函数,使更精准
- 在 msg_storage 函数中,增加 logger保留所有响应消息
- 删除 heartbeat 函数中的日志保存功能部分
- 心跳再次修改为 2s...
3. [t_change_ui: aio.py]
- 增加了日志初始化部分
- detect_network 函数中修改重新实例化HR间隔为 4s对应心跳
4. [t_change_ui: do_brake.py]
- 使用一直打开曲线的方法规避解决了 OOM 的问题,同时修改数据处理方式,只取最后 12s
5. [t_change_ui: do_current.py]
- 保持电流,只取最后 15s
6. [t_change_ui: all the part]: 引入 commons 包,并定制了 logging 输出,后续持续优化
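
The "只需要修改 clibs.py 中的 PREFIX" note in the packaging section above refers to the path prefix constant in commons/clibs.py (shown later in this diff): every asset path is built from it, so switching between the packaged layout and the source tree is a two-line toggle. For reference:
```
# commons/clibs.py: toggle before packaging, switch back for source-code debugging
# PREFIX = ''             # for pyinstaller packaging
PREFIX = '../assets/'     # for source code debug
log_data = f'{PREFIX}templates/c_msg.log'
heartbeat = f'{PREFIX}templates/heartbeat'
```
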

View File

@ -6,8 +6,8 @@ VSVersionInfo(
ffi=FixedFileInfo( ffi=FixedFileInfo(
# filevers and prodvers should be always a tuple with four items: (1, 2, 3, 4) # filevers and prodvers should be always a tuple with four items: (1, 2, 3, 4)
# Set not needed items to zero 0. # Set not needed items to zero 0.
filevers=(0, 2, 0, 3), filevers=(0, 2, 0, 5),
prodvers=(0, 2, 0, 3), prodvers=(0, 2, 0, 5),
# Contains a bitmask that specifies the valid bits 'flags'r # Contains a bitmask that specifies the valid bits 'flags'r
mask=0x3f, mask=0x3f,
# Contains a bitmask that specifies the Boolean attributes of the file. # Contains a bitmask that specifies the Boolean attributes of the file.
@ -31,12 +31,12 @@ VSVersionInfo(
'040904b0', '040904b0',
[StringStruct('CompanyName', 'Rokae - https://www.rokae.com/'), [StringStruct('CompanyName', 'Rokae - https://www.rokae.com/'),
StringStruct('FileDescription', 'All in one automatic toolbox'), StringStruct('FileDescription', 'All in one automatic toolbox'),
StringStruct('FileVersion', '0.2.0.3 (2024-07-27)'), StringStruct('FileVersion', '0.2.0.5 (2024-08-02)'),
StringStruct('InternalName', 'AIO.exe'), StringStruct('InternalName', 'AIO.exe'),
StringStruct('LegalCopyright', '© 2024-2024 Manford Fan'), StringStruct('LegalCopyright', '© 2024-2024 Manford Fan'),
StringStruct('OriginalFilename', 'AIO.exe'), StringStruct('OriginalFilename', 'AIO.exe'),
StringStruct('ProductName', 'AIO'), StringStruct('ProductName', 'AIO'),
StringStruct('ProductVersion', '0.2.0.3 (2024-07-27)')]) StringStruct('ProductVersion', '0.2.0.5 (2024-08-02)')])
]), ]),
VarFileInfo([VarStruct('Translation', [1033, 1200])]) VarFileInfo([VarStruct('Translation', [1033, 1200])])
] ]

View File

@ -1,9 +1,9 @@
openpyxl==3.1.2 concurrent_log_handler==0.9.25
pdfplumber==0.11.0
customtkinter==5.2.2 customtkinter==5.2.2
Jinja2==3.1.4 matplotlib==3.9.1
lxml==5.2.2 numpy==2.0.1
numpy==1.26.4 openpyxl==3.1.2
pandas==2.2.2 pandas==2.2.2
pillow==10.3.0 paramiko==3.4.0
pyinstaller==6.7.0 pdfplumber==0.11.0
pymodbus==3.6.9

View File

@ -1 +1 @@
0.2.0.3 @ 07/27/2024 0.2.0.5 @ 08/02/2024

View File

@ -1 +0,0 @@
__all__ = ['automatic_test', 'data_process']

View File

@ -1,49 +1,37 @@
import tkinter import tkinter
from os.path import exists, dirname from os.path import exists
from os import getcwd from os import getcwd, remove
from threading import Thread from threading import Thread
import tkinter.messagebox import tkinter.messagebox
import customtkinter import customtkinter
from time import time, strftime, localtime, sleep from time import time, strftime, localtime, sleep
from urllib.request import urlopen from urllib.request import urlopen
from socket import setdefaulttimeout
from matplotlib.backends.backend_tkagg import FigureCanvasTkAgg from matplotlib.backends.backend_tkagg import FigureCanvasTkAgg
from data_process import * from data_process import brake, current, iso, wavelogger
from automatic_test import * from automatic_test import do_current, do_brake, btn_functions
from durable_action import * from durable_action import factory_test
import openapi from commons import openapi, clibs
import matplotlib.pyplot as plt from matplotlib.pyplot import rcParams, figure, subplots_adjust
from matplotlib import use from matplotlib import use
from pandas import DataFrame, read_excel from pandas import DataFrame, read_excel
import logging
with open(clibs.log_data, 'w') as _:
for i in range(1, 11):
try:
remove(f'{clibs.log_data}.{i}')
except FileNotFoundError:
pass
logger = logging.getLogger(__file__)
logger.info("日志文件初始化完成...")
use('Agg') use('Agg')
heartbeat = f'{dirname(__file__)}/../assets/templates/heartbeat'
durable_data_current_xlsx = f'{dirname(__file__)}/../assets/templates/durable/durable_data_current.xlsx'
durable_data_current_max_xlsx = f'{dirname(__file__)}/../assets/templates/durable/durable_data_current_max.xlsx'
customtkinter.set_appearance_mode("System") # Modes: "System" (standard), "Dark", "Light" customtkinter.set_appearance_mode("System") # Modes: "System" (standard), "Dark", "Light"
customtkinter.set_default_color_theme("blue") # Themes: "blue" (standard), "green", "dark-blue" customtkinter.set_default_color_theme("blue") # Themes: "blue" (standard), "green", "dark-blue"
customtkinter.set_widget_scaling(1.1) # widget dimensions and text size customtkinter.set_widget_scaling(1.1) # widget dimensions and text size
customtkinter.set_window_scaling(1.1) # window geometry dimensions customtkinter.set_window_scaling(1.1) # window geometry dimensions
setdefaulttimeout(3)
# global vars # global vars
durable_data_current = {
'time': list(range(1, 19)),
'axis1': [0 for _ in range(18)],
'axis2': [0 for _ in range(18)],
'axis3': [0 for _ in range(18)],
'axis4': [0 for _ in range(18)],
'axis5': [0 for _ in range(18)],
'axis6': [0 for _ in range(18)],
}
durable_data_current_max = {
'time': list(range(1, 19)),
'axis1': [0 for _ in range(18)],
'axis2': [0 for _ in range(18)],
'axis3': [0 for _ in range(18)],
'axis4': [0 for _ in range(18)],
'axis5': [0 for _ in range(18)],
'axis6': [0 for _ in range(18)],
}
btns_func = { btns_func = {
'start': {'btn': '', 'row': 1, 'text': '开始运行'}, 'start': {'btn': '', 'row': 1, 'text': '开始运行'},
'check': {'btn': '', 'row': 2, 'text': '检查参数'}, 'check': {'btn': '', 'row': 2, 'text': '检查参数'},
@ -79,6 +67,7 @@ class App(customtkinter.CTk):
self.flg = 0 self.flg = 0
self.df_copy = None self.df_copy = None
self.old_curve = None self.old_curve = None
self.myThread = None
# ===================================================================== # =====================================================================
# configure window # configure window
self.title("AIO - All in one automatic toolbox") self.title("AIO - All in one automatic toolbox")
@ -202,37 +191,36 @@ class App(customtkinter.CTk):
if cur_vers.strip() != new_vers.strip(): if cur_vers.strip() != new_vers.strip():
msg = f"""当前版本:{cur_vers}\n更新版本:{new_vers}\n\n请及时前往钉盘更新~~~""" msg = f"""当前版本:{cur_vers}\n更新版本:{new_vers}\n\n请及时前往钉盘更新~~~"""
tkinter.messagebox.showwarning(title="版本更新", message=msg) tkinter.messagebox.showwarning(title="版本更新", message=msg)
self.destroy()
except: except:
tkinter.messagebox.showwarning(title="版本更新", message="连接服务器失败,无法确认当前是否是最新版本......") tkinter.messagebox.showwarning(title="版本更新", message="连接服务器失败,无法确认当前是否是最新版本......")
# functions below ↓ ---------------------------------------------------------------------------------------- # functions below ↓ ----------------------------------------------------------------------------------------
def create_canvas(self, figure): def create_canvas(self, _figure):
self.canvas = FigureCanvasTkAgg(figure, self.tabview.tab('Durable Action')) self.canvas = FigureCanvasTkAgg(_figure, self.tabview.tab('Durable Action'))
self.canvas.draw() self.canvas.draw()
self.canvas.get_tk_widget().configure(height=600) self.canvas.get_tk_widget().configure(height=600)
self.canvas.get_tk_widget().grid(row=3, column=1, rowspan=3, columnspan=13, padx=20, pady=10, sticky="nsew") self.canvas.get_tk_widget().grid(row=3, column=1, rowspan=3, columnspan=13, padx=20, pady=10, sticky="nsew")
def create_plot(self): def create_plot(self):
plt.rcParams['font.sans-serif'] = ['SimHei'] rcParams['font.sans-serif'] = ['SimHei']
plt.rcParams['axes.unicode_minus'] = False rcParams['axes.unicode_minus'] = False
plt.rcParams['figure.dpi'] = 100 rcParams['figure.dpi'] = 100
plt.rcParams['font.size'] = 14 rcParams['font.size'] = 14
plt.rcParams['lines.marker'] = 'o' rcParams['lines.marker'] = 'o'
curvesel = widgits_da['curvesel']['optionmenu'].get() curvesel = widgits_da['curvesel']['optionmenu'].get()
while True: while True:
if not self.hr.durable_lock: if not self.hr.durable_lock:
self.hr.durable_lock = 1 self.hr.durable_lock = 1
if curvesel == 'device_servo_trq_feedback': if curvesel == 'device_servo_trq_feedback':
df = read_excel(durable_data_current_xlsx) df = read_excel(clibs.durable_data_current_xlsx)
_title = 'device_servo_trq_feedback' _title = 'device_servo_trq_feedback'
elif curvesel == '[max] device_servo_trq_feedback': elif curvesel == '[max] device_servo_trq_feedback':
_title = '[max] device_servo_trq_feedback' _title = '[max] device_servo_trq_feedback'
df = read_excel(durable_data_current_max_xlsx) df = read_excel(clibs.durable_data_current_max_xlsx)
else: else:
_title = 'device_servo_trq_feedback' _title = 'device_servo_trq_feedback'
df = read_excel(durable_data_current_xlsx) df = read_excel(clibs.durable_data_current_xlsx)
self.hr.durable_lock = 0 self.hr.durable_lock = 0
break break
else: else:
@ -242,12 +230,12 @@ class App(customtkinter.CTk):
self.flg = 1 self.flg = 1
self.df_copy = df.copy() self.df_copy = df.copy()
self.old_curve = widgits_da['curvesel']['optionmenu'].get() self.old_curve = widgits_da['curvesel']['optionmenu'].get()
figure = plt.figure(frameon=True, facecolor='#E9E9E9') _figure = figure(frameon=True, facecolor='#E9E9E9')
plt.subplots_adjust(left=0.04, right=0.98, bottom=0.1, top=0.95) subplots_adjust(left=0.04, right=0.98, bottom=0.1, top=0.95)
_ = df['time'].to_list() _ = df['time'].to_list()
_xticks = [str(_i) for _i in _] _xticks = [str(_i) for _i in _]
ax = figure.add_subplot(1, 1, 1) ax = _figure.add_subplot(1, 1, 1)
ax.set_xticks(range(len(_xticks))) ax.set_xticks(range(len(_xticks)))
ax.set_xticklabels(_xticks) ax.set_xticklabels(_xticks)
@ -258,7 +246,7 @@ class App(customtkinter.CTk):
df.plot(grid=True, x='time', y='axis5', ax=ax) df.plot(grid=True, x='time', y='axis5', ax=ax)
df.plot(grid=True, x='time', y='axis6', ax=ax, title=_title, legend='upper left', rot=30) df.plot(grid=True, x='time', y='axis6', ax=ax, title=_title, legend='upper left', rot=30)
self.create_canvas(figure) self.create_canvas(_figure)
def thread_it(self, func, *args): def thread_it(self, func, *args):
""" 将函数打包进线程 """ """ 将函数打包进线程 """
@ -273,7 +261,7 @@ class App(customtkinter.CTk):
self.seg_button.configure(state='disabled') self.seg_button.configure(state='disabled')
# self.tabview.configure(state='disabled') # self.tabview.configure(state='disabled')
self.textbox.delete(index1='1.0', index2='end') self.textbox.delete(index1='1.0', index2='end')
with open(heartbeat, 'r', encoding='utf-8') as f_h: with open(clibs.heartbeat, 'r', encoding='utf-8') as f_h:
c_state = f_h.read().strip() c_state = f_h.read().strip()
if c_state == '0' and value != '功能切换': if c_state == '0' and value != '功能切换':
@ -288,12 +276,12 @@ class App(customtkinter.CTk):
# self.tabview.configure(state='normal') # self.tabview.configure(state='normal')
def detect_network(self): def detect_network(self):
df = DataFrame(durable_data_current) df = DataFrame(clibs.durable_data_current)
df.to_excel(durable_data_current_xlsx, index=False) df.to_excel(clibs.durable_data_current_xlsx, index=False)
df = DataFrame(durable_data_current_max) df = DataFrame(clibs.durable_data_current_max)
df.to_excel(durable_data_current_max_xlsx, index=False) df.to_excel(clibs.durable_data_current_max_xlsx, index=False)
with open(heartbeat, "w", encoding='utf-8') as f_hb: with open(clibs.heartbeat, "w", encoding='utf-8') as f_hb:
f_hb.write('0') f_hb.write('0')
self.hr = openapi.HmiRequest(self.write2textbox) self.hr = openapi.HmiRequest(self.write2textbox)
self.md = openapi.ModbusRequest(self.write2textbox) self.md = openapi.ModbusRequest(self.write2textbox)
@ -302,14 +290,14 @@ class App(customtkinter.CTk):
if self.tabview.get() == 'Durable Action': if self.tabview.get() == 'Durable Action':
self.create_plot() self.create_plot()
with open(heartbeat, 'r', encoding='utf-8') as f_hb: with open(clibs.heartbeat, 'r', encoding='utf-8') as f_hb:
c_state = f_hb.read().strip() c_state = f_hb.read().strip()
pb_color = 'green' if c_state == '1' else 'red' pb_color = 'green' if c_state == '1' else 'red'
self.progressbar.configure(progress_color=pb_color) self.progressbar.configure(progress_color=pb_color)
self.progressbar_da.configure(progress_color=pb_color) self.progressbar_da.configure(progress_color=pb_color)
if c_state == '0': if c_state == '0':
self.hr.t_bool = False self.hr.t_bool = False
sleep(3) sleep(4)
del self.hr del self.hr
self.hr = openapi.HmiRequest(self.write2textbox) self.hr = openapi.HmiRequest(self.write2textbox)
sleep(3) sleep(3)
@ -621,10 +609,10 @@ class App(customtkinter.CTk):
def pre_warning(self): def pre_warning(self):
if self.tabview.get() == 'Durable Action': if self.tabview.get() == 'Durable Action':
df = DataFrame(durable_data_current) df = DataFrame(clibs.durable_data_current)
df.to_excel(durable_data_current_xlsx, index=False) df.to_excel(clibs.durable_data_current_xlsx, index=False)
df = DataFrame(durable_data_current_max) df = DataFrame(clibs.durable_data_current_max)
df.to_excel(durable_data_current_max_xlsx, index=False) df.to_excel(clibs.durable_data_current_max_xlsx, index=False)
if tkinter.messagebox.askyesno(title="开始运行", message="确认机器已按照测试规范更新固件,并提按照测试机型前修改好工程?"): if tkinter.messagebox.askyesno(title="开始运行", message="确认机器已按照测试规范更新固件,并提按照测试机型前修改好工程?"):
pass pass
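
For reference, the 2 s heartbeat and the 4 s reconnect delay above work through a small file-based handshake: the HmiRequest heartbeat thread writes '1' or '0' to clibs.heartbeat depending on whether controller.heart was answered, and detect_network polls that file and rebuilds the connection when it reads '0'. A minimal, self-contained sketch of the idea (heartbeat_writer, network_watchdog and the callbacks are hypothetical names used only for illustration):
```
from threading import Thread
from time import sleep

HEARTBEAT_FILE = 'heartbeat'   # clibs.heartbeat in the real code

def heartbeat_writer(controller_alive, keep_running):
    # what HmiRequest.heartbeat() does: every 2 s, send controller.heart and record the result
    while keep_running():
        flag = '1' if controller_alive() else '0'
        with open(HEARTBEAT_FILE, 'w', encoding='utf-8') as f_hb:
            f_hb.write(flag)
        sleep(2)

def network_watchdog(rebuild_connection):
    # what App.detect_network() does: on '0', wait 4 s (two heartbeat periods) and reconnect
    with open(HEARTBEAT_FILE, 'r', encoding='utf-8') as f_hb:
        if f_hb.read().strip() == '0':
            sleep(4)
            rebuild_connection()

if __name__ == '__main__':
    Thread(target=heartbeat_writer, args=(lambda: False, lambda: True), daemon=True).start()
    sleep(3)
    network_watchdog(lambda: print('rebuilding HmiRequest / ModbusRequest ...'))
```
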

View File

@ -1 +0,0 @@
__all__ = ['btn_functions', 'do_brake', 'do_current']

View File

@ -1,61 +1,54 @@
from json import loads from json import loads
from sys import argv from sys import argv
from logging import getLogger
from commons import clibs
tab_name = clibs.tab_names['at']
def execution(cmd, hr, w2t, **kwargs): logger = getLogger(__file__)
_id = hr.execution(cmd, **kwargs)
_msg = hr.get_from_id(_id)
if not _msg:
w2t(f"无法获取{_id}请求的响应信息", 0, 6, 'red', tab_name='Automatic Test')
else:
_response = loads(_msg)
if not _response:
w2t(f"无法获取{id}请求的响应信息", 0, 1, 'red', tab_name='Automatic Test')
return _response
def trigger_estop(md, w2t): def trigger_estop(md, w2t):
md.trigger_estop() md.trigger_estop()
w2t("触发急停成功,可点击机器状态验证。", 0, 0, 'green', 'Automatic Test') w2t("触发急停成功,可点击机器状态验证。", 0, 0, 'green', tab_name)
def reset_estop(md, w2t): def reset_estop(md, w2t):
md.reset_estop() md.reset_estop()
w2t("恢复急停成功,可点击机器状态验证。", 0, 0, 'green', 'Automatic Test') w2t("恢复急停成功,可点击机器状态验证。", 0, 0, 'green', tab_name)
def get_state(hr, w2t): def get_state(hr, w2t):
# 获取机器状态 # 获取机器状态
_response = execution('state.get_state', hr, w2t) _response = clibs.execution('state.get_state', hr, w2t, tab_name)
stat_desc = {'engine': '上电状态', 'operate': '操作模式', 'rc_state': '控制器状态', 'robot_action': '机器人动作', 'safety_mode': '安全模式', 'servo_mode': '伺服工作模式', 'task_space': '工作任务空间'} stat_desc = {'engine': '上电状态', 'operate': '操作模式', 'rc_state': '控制器状态', 'robot_action': '机器人动作', 'safety_mode': '安全模式', 'servo_mode': '伺服工作模式', 'task_space': '工作任务空间'}
for component, state in _response['data'].items(): for component, state in _response['data'].items():
w2t(f"{stat_desc[component]}: {state}", tab_name='Automatic Test') w2t(f"{stat_desc[component]}: {state}", tab_name=tab_name)
# 获取设备伺服信息 # 获取设备伺服信息
_response = execution('device.get_params', hr, w2t) _response = clibs.execution('device.get_params', hr, w2t, tab_name)
dev_desc = {0: '伺服版本', 1: '伺服参数', 2: '安全板固件', 3: '控制器', 4: '通讯总线', 5: '解释器', 6: '运动控制', 8: '力控版本', 9: '末端固件', 10: '机型文件', 11: '环境包'} dev_desc = {0: '伺服版本', 1: '伺服参数', 2: '安全板固件', 3: '控制器', 4: '通讯总线', 5: '解释器', 6: '运动控制', 8: '力控版本', 9: '末端固件', 10: '机型文件', 11: '环境包'}
dev_vers = {} dev_vers = {}
for device in _response['data']['devices']: for device in _response['data']['devices']:
dev_vers[device['type']] = device['version'] dev_vers[device['type']] = device['version']
for i in sorted(dev_desc.keys()): for i in sorted(dev_desc.keys()):
w2t(f"{dev_desc[i]}: {dev_vers[i]}", tab_name='Automatic Test') w2t(f"{dev_desc[i]}: {dev_vers[i]}", tab_name=tab_name)
# 设置示教器模式 # 设置示教器模式
_response = execution('state.set_tp_mode', hr, w2t, tp_mode='without') _response = clibs.execution('state.set_tp_mode', hr, w2t, tab_name, tp_mode='without')
def warning_info(hr, w2t): def warning_info(hr, w2t):
for msg in hr.c_msg: for msg in hr.c_msg:
if 'alarm' in msg.lower(): if 'alarm' in msg.lower():
w2t(str(loads(msg)), tab_name='Automatic Test') w2t(str(loads(msg)), tab_name=tab_name)
for msg in hr.c_msg_xs: for msg in hr.c_msg_xs:
if 'alarm' in msg.lower(): if 'alarm' in msg.lower():
w2t(str(loads(msg)), tab_name='Automatic Test') w2t(str(loads(msg)), tab_name=tab_name)
def main(hr, md, func, w2t): def main(hr, md, func, w2t):
if hr is None: if hr is None:
w2t("无法连接机器人检查是否已经使用Robot Assist软件连接机器重试中...", 0, 49, 'red', tab_name='Automatic Test') w2t("无法连接机器人检查是否已经使用Robot Assist软件连接机器重试中...", 0, 49, 'red', tab_name)
# func: get_state/ # func: get_state/
match func: match func:
case 'trigger_estop': case 'trigger_estop':

View File

@ -1,30 +1,15 @@
from time import sleep, time, strftime, localtime from time import sleep, time, strftime, localtime
from sys import argv from sys import argv
from os import scandir, mkdir from os import mkdir
from os.path import exists
from paramiko import SSHClient, AutoAddPolicy from paramiko import SSHClient, AutoAddPolicy
from json import loads from json import loads
from openpyxl import load_workbook from openpyxl import load_workbook
import pandas from pandas import DataFrame, concat
from logging import getLogger
from commons import clibs
RADIAN = 57.3 # 180 / 3.1415926 tab_name = clibs.tab_names['at']
tab_name = 'Automatic Test' logger = getLogger(__file__)
def traversal_files(path, w2t):
if not exists(path):
msg = f'数据文件夹{path}不存在,请确认后重试......'
w2t(msg, 0, 1, 'red', tab_name)
else:
dirs = []
files = []
for item in scandir(path):
if item.is_dir():
dirs.append(item.path)
elif item.is_file():
files.append(item.path)
return dirs, files
def check_files(path, loadsel, data_dirs, data_files, w2t): def check_files(path, loadsel, data_dirs, data_files, w2t):
@ -72,51 +57,11 @@ def check_files(path, loadsel, data_dirs, data_files, w2t):
w2t(' 1. configs.xlsx\n 2. reach33/reach66/reach100_xxxx.xlsx\n 3. xxxx.zip', 0, 1, 'red', tab_name) w2t(' 1. configs.xlsx\n 2. reach33/reach66/reach100_xxxx.xlsx\n 3. xxxx.zip', 0, 1, 'red', tab_name)
def prj_to_xcore(prj_file):
ssh = SSHClient()
ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect('192.168.0.160', 22, username='luoshi', password='luoshi2019')
sftp = ssh.open_sftp()
sftp.put(prj_file, '/tmp/target.zip')
cmd = 'cd /tmp; rm -rf target/; mkdir target; unzip -d target/ -q target.zip; '
cmd += 'chmod 777 -R target/; rm target.zip'
ssh.exec_command(cmd)
cmd = 'sudo rm -rf /home/luoshi/bin/controller/projects/target; '
cmd += 'sudo mv /tmp/target/ /home/luoshi/bin/controller/projects/'
stdin, stdout, stderr = ssh.exec_command(cmd, get_pty=True)
stdin.write('luoshi2019' + '\n')
stdin.flush()
print(stdout.read().decode()) # 必须得输出一下stdout才能正确执行sudo
print(stderr.read().decode()) # 顺便也执行以下stderr
cmd = 'cd /home/luoshi/bin/controller/; '
cmd += 'sudo mv projects/target/_build/*.prj projects/target/_build/target.prj'
stdin, stdout, stderr = ssh.exec_command(cmd, get_pty=True)
stdin.write('luoshi2019' + '\n')
stdin.flush()
print(stdout.read().decode()) # 必须得输出一下stdout才能正确执行sudo
print(stderr.read().decode()) # 顺便也执行以下stderr
ssh.close()
def execution(cmd, hr, w2t, **kwargs):
_id = hr.execution(cmd, **kwargs)
_msg = hr.get_from_id(_id)
if not _msg:
w2t(f"无法获取{_id}请求的响应信息", 0, 6, 'red', tab_name)
else:
_response = loads(_msg)
if not _response:
w2t(f"无法获取{id}请求的响应信息", 0, 1, 'red', tab_name)
return _response
def gen_result_file(path, curve_data, axis, _reach, _load, _speed, count): def gen_result_file(path, curve_data, axis, _reach, _load, _speed, count):
_d2d_vel = {'hw_joint_vel_feedback': []} _d2d_vel = {'hw_joint_vel_feedback': []}
_d2d_trq = {'device_servo_trq_feedback': []} _d2d_trq = {'device_servo_trq_feedback': []}
_d2d_stop = {'device_safety_estop': []} _d2d_stop = {'device_safety_estop': []}
for data in curve_data: for data in curve_data[-240:]:
dict_results = data['data'] dict_results = data['data']
for item in dict_results: for item in dict_results:
try: try:
@ -129,13 +74,11 @@ def gen_result_file(path, curve_data, axis, _reach, _load, _speed, count):
_d2d_trq['device_servo_trq_feedback'].extend(item['value']) _d2d_trq['device_servo_trq_feedback'].extend(item['value'])
elif item.get('channel', None) == 0 and item.get('name', None) == 'device_safety_estop': elif item.get('channel', None) == 0 and item.get('name', None) == 'device_safety_estop':
_d2d_stop['device_safety_estop'].extend(item['value']) _d2d_stop['device_safety_estop'].extend(item['value'])
if len(_d2d_trq['device_servo_trq_feedback']) / 1000 > 10:
break
df1 = pandas.DataFrame.from_dict(_d2d_vel) df1 = DataFrame.from_dict(_d2d_vel)
df2 = pandas.DataFrame.from_dict(_d2d_trq) df2 = DataFrame.from_dict(_d2d_trq)
df3 = pandas.DataFrame.from_dict(_d2d_stop) df3 = DataFrame.from_dict(_d2d_stop)
df = pandas.concat([df1, df2, df3], axis=1) df = concat([df1, df2, df3], axis=1)
_filename = f"{path}\\j{axis}\\reach{_reach}_load{_load}_speed{_speed}\\reach{_reach}_load{_load}_speed{_speed}_{count}.data" _filename = f"{path}\\j{axis}\\reach{_reach}_load{_load}_speed{_speed}\\reach{_reach}_load{_load}_speed{_speed}_{count}.data"
df.to_csv(_filename, sep='\t', index=False) df.to_csv(_filename, sep='\t', index=False)
@ -168,14 +111,13 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
else: else:
w2t("configs.xlsx中Target页面A1单元格填写不正确检查后重新运行...", 0, 111, 'red', tab_name) w2t("configs.xlsx中Target页面A1单元格填写不正确检查后重新运行...", 0, 111, 'red', tab_name)
_response = execution('diagnosis.open', hr, w2t, open=True, display_open=True) clibs.execution('diagnosis.open', hr, w2t, tab_name, open=True, display_open=True)
_response = execution('diagnosis.set_params', hr, w2t, display_pdo_params=display_pdo_params) clibs.execution('diagnosis.set_params', hr, w2t, tab_name, display_pdo_params=display_pdo_params)
for condition in result_dirs: for condition in result_dirs:
_reach = condition.split('_')[0].removeprefix('reach') _reach = condition.split('_')[0].removeprefix('reach')
_load = condition.split('_')[1].removeprefix('load') _load = condition.split('_')[1].removeprefix('load')
_speed = condition.split('_')[2].removeprefix('speed') _speed = condition.split('_')[2].removeprefix('speed')
# if _speed != '100' or _reach != '100':
# continue
for axis in range(1, 4): for axis in range(1, 4):
md.write_axis(axis) md.write_axis(axis)
speed_max = 0 speed_max = 0
@ -224,11 +166,11 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
print(stderr.read().decode()) # 顺便也执行以下stderr print(stderr.read().decode()) # 顺便也执行以下stderr
# 3. reload工程后pp2main并且自动模式和上电最后运行程序 # 3. reload工程后pp2main并且自动模式和上电最后运行程序
_response = execution('overview.reload', hr, w2t, prj_path=prj_path, tasks=['brake', 'stop0_related']) clibs.execution('overview.reload', hr, w2t, tab_name, prj_path=prj_path, tasks=['brake', 'stop0_related'])
_response = execution('rl_task.pp_to_main', hr, w2t, tasks=['brake', 'stop0_related']) clibs.execution('rl_task.pp_to_main', hr, w2t, tab_name, tasks=['brake', 'stop0_related'])
_response = execution('state.switch_auto', hr, w2t) clibs.execution('state.switch_auto', hr, w2t, tab_name)
_response = execution('state.switch_motor_on', hr, w2t) clibs.execution('state.switch_motor_on', hr, w2t, tab_name)
_response = execution('rl_task.run', hr, w2t, tasks=['brake', 'stop0_related']) clibs.execution('rl_task.run', hr, w2t, tab_name, tasks=['brake', 'stop0_related'])
_t_start = time() _t_start = time()
while True: while True:
if md.read_ready_to_go() == 1: if md.read_ready_to_go() == 1:
@ -241,8 +183,7 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
sleep(1) sleep(1)
# 4. 打开诊断曲线并执行采集之后触发软急停关闭曲线采集找出最大速度传递给RL程序最后清除相关记录 # 4. 打开诊断曲线并执行采集之后触发软急停关闭曲线采集找出最大速度传递给RL程序最后清除相关记录
sleep(get_init_speed) # 获取实际最大速度可通过configs.xlsx配置 sleep(get_init_speed) # 获取实际最大速度可通过configs.xlsx配置
_response = execution('rl_task.stop', hr, w2t, tasks=['brake']) clibs.execution('rl_task.stop', hr, w2t, tab_name, tasks=['brake'])
# sleep(1)
# 找出最大速度 # 找出最大速度
_c_msg = hr.c_msg.copy() _c_msg = hr.c_msg.copy()
for _msg in _c_msg: for _msg in _c_msg:
@ -250,7 +191,7 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
dict_results = loads(_msg)['data'] dict_results = loads(_msg)['data']
for item in dict_results: for item in dict_results:
if item.get('channel', None) == axis-1 and item.get('name', None) == 'hw_joint_vel_feedback': if item.get('channel', None) == axis-1 and item.get('name', None) == 'hw_joint_vel_feedback':
_ = RADIAN * sum(item['value']) / len(item['value']) _ = clibs.RADIAN * sum(item['value']) / len(item['value'])
if ws.cell(row=1, column=1).value == 'positive': if ws.cell(row=1, column=1).value == 'positive':
speed_max = max(_, speed_max) speed_max = max(_, speed_max)
elif ws.cell(row=1, column=1).value == 'negative': elif ws.cell(row=1, column=1).value == 'negative':
@ -264,8 +205,8 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
md.write_speed_max(speed_max) md.write_speed_max(speed_max)
hr.c_msg_xs.clear() hr.c_msg_xs.clear()
if len(hr.c_msg) > 240: if len(hr.c_msg) > 270:
del hr.c_msg[240:] del hr.c_msg[270:]
if speed_max < 10: if speed_max < 10:
md.clear_alarm() md.clear_alarm()
@ -278,11 +219,11 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
md.reset_estop() # 其实没必要 md.reset_estop() # 其实没必要
md.clear_alarm() md.clear_alarm()
_response = execution('overview.reload', hr, w2t, prj_path=prj_path, tasks=['brake', 'stop0_related']) clibs.execution('overview.reload', hr, w2t, tab_name, prj_path=prj_path, tasks=['brake', 'stop0_related'])
_response = execution('rl_task.pp_to_main', hr, w2t, tasks=['brake', 'stop0_related']) clibs.execution('rl_task.pp_to_main', hr, w2t, tab_name, tasks=['brake', 'stop0_related'])
_response = execution('state.switch_auto', hr, w2t) clibs.execution('state.switch_auto', hr, w2t, tab_name)
_response = execution('state.switch_motor_on', hr, w2t) clibs.execution('state.switch_motor_on', hr, w2t, tab_name)
_response = execution('rl_task.run', hr, w2t, tasks=['brake', 'stop0_related']) clibs.execution('rl_task.run', hr, w2t, tab_name, tasks=['brake', 'stop0_related'])
for i in range(3): for i in range(3):
if md.read_ready_to_go() == 1: if md.read_ready_to_go() == 1:
md.write_act(1) md.write_act(1)
@ -316,8 +257,8 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
curve_data.insert(0, loads(_msg)) curve_data.insert(0, loads(_msg))
else: else:
hr.c_msg_xs.clear() hr.c_msg_xs.clear()
if len(hr.c_msg) > 240: if len(hr.c_msg) > 270:
del hr.c_msg[240:] del hr.c_msg[270:]
gen_result_file(path, curve_data, axis, _reach, _load, _speed, count) gen_result_file(path, curve_data, axis, _reach, _load, _speed, count)
else: else:
w2t(f"\n{loadsel.removeprefix('tool')}%负载的制动性能测试执行完毕,如需采集其他负载,须切换负载类型,并更换其他负载,重新执行。", 0, 0, 'green', tab_name) w2t(f"\n{loadsel.removeprefix('tool')}%负载的制动性能测试执行完毕,如需采集其他负载,须切换负载类型,并更换其他负载,重新执行。", 0, 0, 'green', tab_name)
@ -325,9 +266,9 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
def main(path, hr, md, loadsel, w2t): def main(path, hr, md, loadsel, w2t):
_s_time = time() _s_time = time()
data_dirs, data_files = traversal_files(path, w2t) data_dirs, data_files = clibs.traversal_files(path, w2t)
config_file, reach33, reach66, reach100, prj_file, result_dirs = check_files(path, loadsel, data_dirs, data_files, w2t) config_file, reach33, reach66, reach100, prj_file, result_dirs = check_files(path, loadsel, data_dirs, data_files, w2t)
prj_to_xcore(prj_file) clibs.prj_to_xcore(prj_file)
run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t) run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t)
_e_time = time() _e_time = time()
time_total = _e_time - _s_time time_total = _e_time - _s_time

View File

@ -1,12 +1,14 @@
import os from os import mkdir
from time import sleep, time from time import sleep, time
from sys import argv from sys import argv
from os import scandir
from os.path import exists
from paramiko import SSHClient, AutoAddPolicy from paramiko import SSHClient, AutoAddPolicy
from json import loads from json import loads
import pandas from pandas import DataFrame, concat
from logging import getLogger
from commons import clibs
logger = getLogger(__file__)
tab_name = clibs.tab_names['at']
display_pdo_params = [ display_pdo_params = [
{"name": "hw_joint_vel_feedback", "channel": 0}, {"name": "hw_joint_vel_feedback", "channel": 0},
{"name": "hw_joint_vel_feedback", "channel": 1}, {"name": "hw_joint_vel_feedback", "channel": 1},
@ -22,26 +24,11 @@ display_pdo_params = [
{"name": "device_servo_trq_feedback", "channel": 5}, {"name": "device_servo_trq_feedback", "channel": 5},
] ]
def traversal_files(path, w2t):
if not exists(path):
msg = f'数据文件夹{path}不存在,请确认后重试......'
w2t(msg, 0, 1, 'red', tab_name='Automatic Test')
else:
dirs = []
files = []
for item in scandir(path):
if item.is_dir():
dirs.append(item.path)
elif item.is_file():
files.append(item.path)
return dirs, files
def check_files(path, loadsel, data_dirs, data_files, w2t): def check_files(path, loadsel, data_dirs, data_files, w2t):
if len(data_dirs) != 0 or len(data_files) != 3: if len(data_dirs) != 0 or len(data_files) != 3:
w2t('初始路径下不允许有文件夹,且初始路径下只能存在如下三个文件,确认后重新运行!', 0, 0, 'red', tab_name='Automatic Test') w2t('初始路径下不允许有文件夹,且初始路径下只能存在如下三个文件,确认后重新运行!', 0, 0, 'red', tab_name)
w2t(' 1. configs.xlsx\n 2. T_电机电流.xlsx\n 3. xxxx.zip', 0, 1, 'red', tab_name='Automatic Test') w2t(' 1. configs.xlsx\n 2. T_电机电流.xlsx\n 3. xxxx.zip', 0, 1, 'red', tab_name)
config_file = current_file = prj_file = None config_file = current_file = prj_file = None
for data_file in data_files: for data_file in data_files:
@ -53,62 +40,22 @@ def check_files(path, loadsel, data_dirs, data_files, w2t):
elif filename.endswith('.zip'): elif filename.endswith('.zip'):
prj_file = data_file prj_file = data_file
else: else:
w2t('初始路径下不允许有文件夹,且初始路径下只能存在如下三个文件,确认后重新运行!', 0, 0, 'red', tab_name='Automatic Test') w2t('初始路径下不允许有文件夹,且初始路径下只能存在如下三个文件,确认后重新运行!', 0, 0, 'red', tab_name)
w2t(' 1. configs.xlsx\n 2. T_电机电流.xlsx\n 3. xxxx.zip', 0, 1, 'red', tab_name='Automatic Test') w2t(' 1. configs.xlsx\n 2. T_电机电流.xlsx\n 3. xxxx.zip', 0, 1, 'red', tab_name)
if config_file and current_file and prj_file: if config_file and current_file and prj_file:
w2t("数据目录合规性检查结束,未发现问题......", tab_name='Automatic Test') w2t("数据目录合规性检查结束,未发现问题......", tab_name=tab_name)
if loadsel == 'tool100': if loadsel == 'tool100':
os.mkdir(f"{path}\\single") mkdir(f"{path}\\single")
os.mkdir(f"{path}\\s_1") mkdir(f"{path}\\s_1")
os.mkdir(f"{path}\\s_2") mkdir(f"{path}\\s_2")
os.mkdir(f"{path}\\s_3") mkdir(f"{path}\\s_3")
elif loadsel == 'inertia': elif loadsel == 'inertia':
os.mkdir(f"{path}\\inertia") mkdir(f"{path}\\inertia")
return config_file, current_file, prj_file return config_file, current_file, prj_file
else: else:
w2t('初始路径下不允许有文件夹,且初始路径下只能存在如下三个文件,确认后重新运行!', 0, 0, 'red', tab_name='Automatic Test') w2t('初始路径下不允许有文件夹,且初始路径下只能存在如下三个文件,确认后重新运行!', 0, 0, 'red', tab_name)
w2t(' 1. configs.xlsx\n 2. T_电机电流.xlsx\n 3. xxxx.zip', 0, 1, 'red', tab_name='Automatic Test') w2t(' 1. configs.xlsx\n 2. T_电机电流.xlsx\n 3. xxxx.zip', 0, 1, 'red', tab_name)
def prj_to_xcore(prj_file):
ssh = SSHClient()
ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect('192.168.0.160', 22, username='luoshi', password='luoshi2019')
sftp = ssh.open_sftp()
sftp.put(prj_file, '/tmp/target.zip')
cmd = 'cd /tmp; rm -rf target/; mkdir target; unzip -d target/ -q target.zip; '
cmd += 'chmod 777 -R target/; rm target.zip'
ssh.exec_command(cmd)
cmd = 'sudo rm -rf /home/luoshi/bin/controller/projects/target; '
cmd += 'sudo mv /tmp/target/ /home/luoshi/bin/controller/projects/'
stdin, stdout, stderr = ssh.exec_command(cmd, get_pty=True)
stdin.write('luoshi2019' + '\n')
stdin.flush()
print(stdout.read().decode()) # 必须得输出一下stdout才能正确执行sudo
print(stderr.read().decode()) # 顺便也执行以下stderr
cmd = 'cd /home/luoshi/bin/controller/; '
cmd += 'sudo mv projects/target/_build/*.prj projects/target/_build/target.prj'
stdin, stdout, stderr = ssh.exec_command(cmd, get_pty=True)
stdin.write('luoshi2019' + '\n')
stdin.flush()
print(stdout.read().decode()) # 必须得输出一下stdout才能正确执行sudo
print(stderr.read().decode()) # 顺便也执行以下stderr
ssh.close()
def execution(cmd, hr, w2t, **kwargs):
_id = hr.execution(cmd, **kwargs)
_msg = hr.get_from_id(_id)
if not _msg:
w2t(f"无法获取{_id}请求的响应信息", 0, 7, 'red', tab_name='Automatic Test')
else:
_response = loads(_msg)
if not _response:
w2t(f"无法获取{id}请求的响应信息", 0, 1, 'red', tab_name='Automatic Test')
return _response
def data_proc_regular(path, filename, channel, scenario_time): def data_proc_regular(path, filename, channel, scenario_time):
@ -129,9 +76,9 @@ def data_proc_regular(path, filename, channel, scenario_time):
elif item.get('channel', None) == channel and item.get('name', None) == 'device_servo_trq_feedback': elif item.get('channel', None) == channel and item.get('name', None) == 'device_servo_trq_feedback':
_d2d_trq['device_servo_trq_feedback'].extend(item['value']) _d2d_trq['device_servo_trq_feedback'].extend(item['value'])
df1 = pandas.DataFrame.from_dict(_d2d_vel) df1 = DataFrame.from_dict(_d2d_vel)
df2 = pandas.DataFrame.from_dict(_d2d_trq) df2 = DataFrame.from_dict(_d2d_trq)
df = pandas.concat([df1, df2], axis=1) df = concat([df1, df2], axis=1)
_filename = f'{path}\\single\\j{channel+1}_single_{time()}.data' _filename = f'{path}\\single\\j{channel+1}_single_{time()}.data'
df.to_csv(_filename, sep='\t', index=False) df.to_csv(_filename, sep='\t', index=False)
elif channel in list(range(6, 9)): elif channel in list(range(6, 9)):
@ -181,39 +128,39 @@ def data_proc_regular(path, filename, channel, scenario_time):
elif item.get('channel', None) == 5 and item.get('name', None) == 'device_servo_trq_feedback': elif item.get('channel', None) == 5 and item.get('name', None) == 'device_servo_trq_feedback':
_d2d_trq_5['device_servo_trq_feedback'].extend(item['value']) _d2d_trq_5['device_servo_trq_feedback'].extend(item['value'])
df_01 = pandas.DataFrame.from_dict(_d2d_vel_0) df_01 = DataFrame.from_dict(_d2d_vel_0)
df_02 = pandas.DataFrame.from_dict(_d2d_trq_0) df_02 = DataFrame.from_dict(_d2d_trq_0)
df = pandas.concat([df_01, df_02], axis=1) df = concat([df_01, df_02], axis=1)
_filename = f'{path}\\s_{channel-5}\\j1_s_{channel-5}_{scenario_time}_{time()}.data' _filename = f'{path}\\s_{channel-5}\\j1_s_{channel-5}_{scenario_time}_{time()}.data'
df.to_csv(_filename, sep='\t', index=False) df.to_csv(_filename, sep='\t', index=False)
df_01 = pandas.DataFrame.from_dict(_d2d_vel_1) df_01 = DataFrame.from_dict(_d2d_vel_1)
df_02 = pandas.DataFrame.from_dict(_d2d_trq_1) df_02 = DataFrame.from_dict(_d2d_trq_1)
df = pandas.concat([df_01, df_02], axis=1) df = concat([df_01, df_02], axis=1)
_filename = f'{path}\\s_{channel-5}\\j2_s_{channel-5}_{scenario_time}_{time()}.data' _filename = f'{path}\\s_{channel-5}\\j2_s_{channel-5}_{scenario_time}_{time()}.data'
df.to_csv(_filename, sep='\t', index=False) df.to_csv(_filename, sep='\t', index=False)
df_01 = pandas.DataFrame.from_dict(_d2d_vel_2) df_01 = DataFrame.from_dict(_d2d_vel_2)
df_02 = pandas.DataFrame.from_dict(_d2d_trq_2) df_02 = DataFrame.from_dict(_d2d_trq_2)
df = pandas.concat([df_01, df_02], axis=1) df = concat([df_01, df_02], axis=1)
_filename = f'{path}\\s_{channel-5}\\j3_s_{channel-5}_{scenario_time}_{time()}.data' _filename = f'{path}\\s_{channel-5}\\j3_s_{channel-5}_{scenario_time}_{time()}.data'
df.to_csv(_filename, sep='\t', index=False) df.to_csv(_filename, sep='\t', index=False)
df_01 = pandas.DataFrame.from_dict(_d2d_vel_3) df_01 = DataFrame.from_dict(_d2d_vel_3)
df_02 = pandas.DataFrame.from_dict(_d2d_trq_3) df_02 = DataFrame.from_dict(_d2d_trq_3)
df = pandas.concat([df_01, df_02], axis=1) df = concat([df_01, df_02], axis=1)
_filename = f'{path}\\s_{channel-5}\\j4_s_{channel-5}_{scenario_time}_{time()}.data' _filename = f'{path}\\s_{channel-5}\\j4_s_{channel-5}_{scenario_time}_{time()}.data'
df.to_csv(_filename, sep='\t', index=False) df.to_csv(_filename, sep='\t', index=False)
df_01 = pandas.DataFrame.from_dict(_d2d_vel_4) df_01 = DataFrame.from_dict(_d2d_vel_4)
df_02 = pandas.DataFrame.from_dict(_d2d_trq_4) df_02 = DataFrame.from_dict(_d2d_trq_4)
df = pandas.concat([df_01, df_02], axis=1) df = concat([df_01, df_02], axis=1)
_filename = f'{path}\\s_{channel-5}\\j5_s_{channel-5}_{scenario_time}_{time()}.data' _filename = f'{path}\\s_{channel-5}\\j5_s_{channel-5}_{scenario_time}_{time()}.data'
df.to_csv(_filename, sep='\t', index=False) df.to_csv(_filename, sep='\t', index=False)
df_01 = pandas.DataFrame.from_dict(_d2d_vel_5) df_01 = DataFrame.from_dict(_d2d_vel_5)
df_02 = pandas.DataFrame.from_dict(_d2d_trq_5) df_02 = DataFrame.from_dict(_d2d_trq_5)
df = pandas.concat([df_01, df_02], axis=1) df = concat([df_01, df_02], axis=1)
_filename = f'{path}\\s_{channel-5}\\j6_s_{channel-5}_{scenario_time}_{time()}.data' _filename = f'{path}\\s_{channel-5}\\j6_s_{channel-5}_{scenario_time}_{time()}.data'
df.to_csv(_filename, sep='\t', index=False) df.to_csv(_filename, sep='\t', index=False)
elif channel in list(range(9, 15)): elif channel in list(range(9, 15)):
@ -221,7 +168,7 @@ def data_proc_regular(path, filename, channel, scenario_time):
lines = f_obj.readlines() lines = f_obj.readlines()
_d2d_vel = {'hw_joint_vel_feedback': []} _d2d_vel = {'hw_joint_vel_feedback': []}
_d2d_trq = {'device_servo_trq_feedback': []} _d2d_trq = {'device_servo_trq_feedback': []}
for line in lines: for line in lines[-300:]:
data = eval(line.strip())['data'] data = eval(line.strip())['data']
for item in data: for item in data:
try: try:
@ -233,9 +180,9 @@ def data_proc_regular(path, filename, channel, scenario_time):
elif item.get('channel', None) == channel-9 and item.get('name', None) == 'device_servo_trq_feedback': elif item.get('channel', None) == channel-9 and item.get('name', None) == 'device_servo_trq_feedback':
_d2d_trq['device_servo_trq_feedback'].extend(item['value']) _d2d_trq['device_servo_trq_feedback'].extend(item['value'])
df1 = pandas.DataFrame.from_dict(_d2d_vel) df1 = DataFrame.from_dict(_d2d_vel)
df2 = pandas.DataFrame.from_dict(_d2d_trq) df2 = DataFrame.from_dict(_d2d_trq)
df = pandas.concat([df1, df2], axis=1) df = concat([df1, df2], axis=1)
_filename = f'{path}\\single\\j{channel-8}_hold_{time()}.data' _filename = f'{path}\\single\\j{channel-8}_hold_{time()}.data'
df.to_csv(_filename, sep='\t', index=False) df.to_csv(_filename, sep='\t', index=False)
@ -257,9 +204,9 @@ def data_proc_inertia(path, filename, channel):
elif item.get('channel', None) == channel+3 and item.get('name', None) == 'device_servo_trq_feedback': elif item.get('channel', None) == channel+3 and item.get('name', None) == 'device_servo_trq_feedback':
_d2d_trq['device_servo_trq_feedback'].extend(item['value']) _d2d_trq['device_servo_trq_feedback'].extend(item['value'])
df1 = pandas.DataFrame.from_dict(_d2d_vel) df1 = DataFrame.from_dict(_d2d_vel)
df2 = pandas.DataFrame.from_dict(_d2d_trq) df2 = DataFrame.from_dict(_d2d_trq)
df = pandas.concat([df1, df2], axis=1) df = concat([df1, df2], axis=1)
_filename = f'{path}\\inertia\\j{channel+4}_inertia_{time()}.data' _filename = f'{path}\\inertia\\j{channel+4}_inertia_{time()}.data'
df.to_csv(_filename, sep='\t', index=False) df.to_csv(_filename, sep='\t', index=False)
@ -313,21 +260,21 @@ def run_rl(path, hr, md, loadsel, w2t):
disc = disc_inertia disc = disc_inertia
# preparation 触发软急停,并解除,目的是让可能正在运行着的机器停下来 # preparation 触发软急停,并解除,目的是让可能正在运行着的机器停下来
_response = execution('diagnosis.open', hr, w2t, open=True, display_open=True) clibs.execution('diagnosis.open', hr, w2t, tab_name, open=True, display_open=True)
_response = execution('diagnosis.set_params', hr, w2t, display_pdo_params=display_pdo_params) clibs.execution('diagnosis.set_params', hr, w2t, tab_name, display_pdo_params=display_pdo_params)
# _response = execution('diagnosis.save', hr, w2t, save=True) # 这条命令有问题 # clibs.execution('diagnosis.save', hr, w2t, tab_name, save=True) # 这条命令有问题
md.trigger_estop() md.trigger_estop()
md.reset_estop() md.reset_estop()
for condition in conditions: for condition in conditions:
number = conditions.index(condition) number = conditions.index(condition)
w2t(f"正在执行{disc[number][0]}测试......", 0, 0, 'purple', 'Automatic Test') w2t(f"正在执行{disc[number][0]}测试......", 0, 0, 'purple', tab_name)
# 1. 将act重置为False并修改未要执行的场景 # 1. 将act重置为False并修改未要执行的场景
md.write_act(False) md.write_act(False)
ssh = SSHClient() ssh = SSHClient()
ssh.set_missing_host_key_policy(AutoAddPolicy()) ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect('192.168.0.160', 22, username='luoshi', password='luoshi2019') ssh.connect(clibs.ip_addr, 22, username='luoshi', password='luoshi2019')
cmd = 'cd /home/luoshi/bin/controller/; ' cmd = 'cd /home/luoshi/bin/controller/; '
cmd += 'sudo sed -i "/scenario/d" projects/target/_build/current/main.mod; ' cmd += 'sudo sed -i "/scenario/d" projects/target/_build/current/main.mod; '
cmd += f'sudo sed -i "/DONOTDELETE/i {condition}" projects/target/_build/current/main.mod' cmd += f'sudo sed -i "/DONOTDELETE/i {condition}" projects/target/_build/current/main.mod'
@ -339,13 +286,13 @@ def run_rl(path, hr, md, loadsel, w2t):
# 2. reload工程后pp2main并且自动模式和上电 # 2. reload工程后pp2main并且自动模式和上电
prj_path = 'target/_build/target.prj' prj_path = 'target/_build/target.prj'
_response = execution('overview.reload', hr, w2t, prj_path=prj_path, tasks=['current']) clibs.execution('overview.reload', hr, w2t, tab_name, prj_path=prj_path, tasks=['current'])
_response = execution('rl_task.pp_to_main', hr, w2t, tasks=['current']) clibs.execution('rl_task.pp_to_main', hr, w2t, tab_name, tasks=['current'])
_response = execution('state.switch_auto', hr, w2t) clibs.execution('state.switch_auto', hr, w2t, tab_name)
_response = execution('state.switch_motor_on', hr, w2t) clibs.execution('state.switch_motor_on', hr, w2t, tab_name)
# 3. 开始运行程序单轴运行35s # 3. 开始运行程序单轴运行35s
_response = execution('rl_task.run', hr, w2t, tasks=['current']) clibs.execution('rl_task.run', hr, w2t, tab_name, tasks=['current'])
_t_start = time() _t_start = time()
while True: while True:
if md.read_ready_to_go() == 1: if md.read_ready_to_go() == 1:
@ -353,7 +300,7 @@ def run_rl(path, hr, md, loadsel, w2t):
break break
else: else:
if (time() - _t_start) // 20 > 1: if (time() - _t_start) // 20 > 1:
w2t("20s内未收到机器人的运行信号需要确认RL程序编写正确并正常执行...", 0, 111, 'red', 'Automatic Test') w2t("20s内未收到机器人的运行信号需要确认RL程序编写正确并正常执行...", 0, 111, 'red', tab_name)
else: else:
sleep(1) sleep(1)
@ -369,11 +316,11 @@ def run_rl(path, hr, md, loadsel, w2t):
while True: while True:
scenario_time = md.read_scenario_time() scenario_time = md.read_scenario_time()
if float(scenario_time) > 1: if float(scenario_time) > 1:
w2t(f"场景{number-5}的周期时间:{scenario_time}", 0, 0, 'green', 'Automatic Test') w2t(f"场景{number-5}的周期时间:{scenario_time}", 0, 0, 'green', tab_name)
break break
else: else:
if (time()-_t_start)//60 > 3: if (time()-_t_start)//60 > 3:
w2t(f"未收到场景{number-5}的周期时间需要确认RL程序编写正确并正常执行...", 0, 111, 'red', 'Automatic Test') w2t(f"未收到场景{number-5}的周期时间需要确认RL程序编写正确并正常执行...", 0, 111, 'red', tab_name)
else: else:
sleep(5) sleep(5)
sleep(1) # 一定要延迟一秒再读一次scenario time寄存器因为一开始读取的数值不准确 sleep(1) # 一定要延迟一秒再读一次scenario time寄存器因为一开始读取的数值不准确
@ -381,27 +328,27 @@ def run_rl(path, hr, md, loadsel, w2t):
sleep(float(scenario_time)*0.2) # 再运行周期的20%即可 sleep(float(scenario_time)*0.2) # 再运行周期的20%即可
# 5.停止程序运行,保留数据并处理输出 # 5.停止程序运行,保留数据并处理输出
_response = execution('rl_task.stop', hr, w2t, tasks=['current']) clibs.execution('rl_task.stop', hr, w2t, tab_name, tasks=['current'])
_c_msg = hr.c_msg.copy() _c_msg = hr.c_msg.copy()
for _msg in _c_msg: for _msg in _c_msg:
if 'diagnosis.result' in _msg: if 'diagnosis.result' in _msg:
disc[number][1].insert(0, loads(_msg)) disc[number][1].insert(0, loads(_msg))
else: else:
hr.c_msg_xs.clear() hr.c_msg_xs.clear()
if len(hr.c_msg) > 240: if len(hr.c_msg) > 270:
del hr.c_msg[240:] del hr.c_msg[270:]
gen_result_file(path, loadsel, disc, number, scenario_time) gen_result_file(path, loadsel, disc, number, scenario_time)
else: else:
if loadsel == 'tool100': if loadsel == 'tool100':
w2t("单轴和场景电机电流采集完毕,如需采集惯量负载,须切换负载类型,并更换惯量负载,重新执行。", 0, 0, 'green', 'Automatic Test') w2t("单轴和场景电机电流采集完毕,如需采集惯量负载,须切换负载类型,并更换惯量负载,重新执行。", 0, 0, 'green', tab_name)
elif loadsel == 'inertia': elif loadsel == 'inertia':
w2t("惯量负载电机电流采集完毕,如需采集单轴/场景/保持电机电流,须切换负载类型,并更换偏置负载,重新执行。", 0, 0, 'green', 'Automatic Test') w2t("惯量负载电机电流采集完毕,如需采集单轴/场景/保持电机电流,须切换负载类型,并更换偏置负载,重新执行。", 0, 0, 'green', tab_name)
def main(path, hr, md, loadsel, w2t): def main(path, hr, md, loadsel, w2t):
data_dirs, data_files = traversal_files(path, w2t) data_dirs, data_files = clibs.traversal_files(path, w2t)
config_file, current_file, prj_file = check_files(path, loadsel, data_dirs, data_files, w2t) config_file, current_file, prj_file = check_files(path, loadsel, data_dirs, data_files, w2t)
prj_to_xcore(prj_file) clibs.prj_to_xcore(prj_file)
run_rl(path, hr, md, loadsel, w2t) run_rl(path, hr, md, loadsel, w2t)

aio/code/commons/clibs.py Normal file
View File

@ -0,0 +1,131 @@
from os import scandir
from threading import Thread
from time import sleep
from os.path import exists
from paramiko import SSHClient, AutoAddPolicy
from socket import setdefaulttimeout
from logging import DEBUG, INFO, WARNING, ERROR, CRITICAL, Formatter, StreamHandler, basicConfig
from concurrent_log_handler import ConcurrentRotatingFileHandler
ip_addr = '192.168.0.160'
RADIAN = 57.3 # 180 / 3.1415926
MAX_FRAME_SIZE = 1024
TIMEOUT = 5
setdefaulttimeout(TIMEOUT)
tab_names = {'dp': 'Data Process', 'at': 'Automatic Test', 'da': 'Duration Action', 'op': 'openapi'}
# PREFIX = '' # for pyinstaller packaging
PREFIX = '../assets/' # for source code debug
log_data = f'{PREFIX}templates/c_msg.log'
heartbeat = f'{PREFIX}templates/heartbeat'
durable_data_current_xlsx = f'{PREFIX}templates/durable/durable_data_current.xlsx'
durable_data_current_max_xlsx = f'{PREFIX}templates/durable/durable_data_current_max.xlsx'
durable_data_current = {
'time': list(range(1, 19)),
'axis1': [0 for _ in range(18)],
'axis2': [0 for _ in range(18)],
'axis3': [0 for _ in range(18)],
'axis4': [0 for _ in range(18)],
'axis5': [0 for _ in range(18)],
'axis6': [0 for _ in range(18)],
}
durable_data_current_max = {
'time': list(range(1, 19)),
'axis1': [0 for _ in range(18)],
'axis2': [0 for _ in range(18)],
'axis3': [0 for _ in range(18)],
'axis4': [0 for _ in range(18)],
'axis5': [0 for _ in range(18)],
'axis6': [0 for _ in range(18)],
}
fmt = Formatter('%(asctime)s # %(levelname)s-%(filename)s-%(funcName)s # %(message)s')
# file_handler = logging.FileHandler(log_data)
# file_handler = RotatingFileHandler(filename=log_data, backupCount=10, maxBytes=50*1024*1024, encoding='utf-8')
file_handler = ConcurrentRotatingFileHandler(filename=log_data, backupCount=10, maxBytes=50*1024*1024, encoding='utf-8')
file_handler.setFormatter(fmt)
file_handler.setLevel(INFO)
console_handler = StreamHandler()
console_handler.setFormatter(fmt)
console_handler.setLevel(ERROR)
# basicConfig(level=WARNING, # for product
basicConfig(level=WARNING,
datefmt='%Y-%m-%dT%H:%M:%S',
# handlers=[file_handler]) # for product
handlers=[file_handler, console_handler])
class GetThreadResult(Thread):
def __init__(self, func, args=()):
super(GetThreadResult, self).__init__()
self.func = func
self.args = args
self.result = 0
def run(self):
sleep(1)
self.result = self.func(*self.args)
def get_result(self):
Thread.join(self) # 等待线程执行完毕
try:
return self.result
except Exception as Err:
return None
def traversal_files(path, w2t):
# 功能:以列表的形式分别返回指定路径下的文件和文件夹,不包含子目录
# 参数:路径
# 返回值:路径下的文件夹列表 路径下的文件列表
if not exists(path):
msg = f'数据文件夹{path}不存在,请确认后重试......'
w2t(msg, 0, 1, 'red')
else:
dirs = []
files = []
for item in scandir(path):
if item.is_dir():
dirs.append(item.path)
elif item.is_file():
files.append(item.path)
return dirs, files
def prj_to_xcore(prj_file):
ssh = SSHClient()
ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect(ip_addr, 22, username='luoshi', password='luoshi2019')
sftp = ssh.open_sftp()
sftp.put(prj_file, '/tmp/target.zip')
cmd = 'cd /tmp; rm -rf target/; mkdir target; unzip -d target/ -q target.zip; '
cmd += 'chmod 777 -R target/; rm target.zip'
ssh.exec_command(cmd)
cmd = 'sudo rm -rf /home/luoshi/bin/controller/projects/target; '
cmd += 'sudo mv /tmp/target/ /home/luoshi/bin/controller/projects/'
stdin, stdout, stderr = ssh.exec_command(cmd, get_pty=True)
stdin.write('luoshi2019' + '\n')
stdin.flush()
print(stdout.read().decode()) # 必须得输出一下stdout才能正确执行sudo
print(stderr.read().decode()) # 顺便也执行以下stderr
cmd = 'cd /home/luoshi/bin/controller/; '
cmd += 'sudo chmod -R 755 projects; rm /tmp/*.prj; sudo mv projects/target/_build/*.prj /tmp; cd /tmp; '
cmd += 'prj=($(ls *.prj)); sudo mv ${prj[0]} /home/luoshi/bin/controller/projects/target/_build/target.prj; '
stdin, stdout, stderr = ssh.exec_command(cmd, get_pty=True)
stdin.write('luoshi2019' + '\n')
stdin.flush()
print(stdout.read().decode()) # 必须得输出一下stdout才能正确执行sudo
print(stderr.read().decode()) # 顺便也执行以下stderr
ssh.close()
def execution(cmd, hr, w2t, tab_name, **kwargs):
_id = hr.execution(cmd, **kwargs)
_msg = hr.get_from_id(_id)
if not _msg:
w2t(f"无法获取{_id}请求的响应信息", 0, 6, 'red', tab_name)
else:
return eval(_msg.split('#')[2])
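
The execution() helper above replaces the per-module copies removed from btn_functions.py, do_brake.py and do_current.py; callers now pass the tab name explicitly and get the decoded response back (or an error written via w2t). A hypothetical call site mirroring the sequences in the automatic-test modules, where hr and w2t are the HmiRequest instance and the text-output callback created in aio.py:
```
from commons import clibs

def reload_and_run(hr, w2t, tab_name=clibs.tab_names['at']):
    # reload the target project, move PP to main, switch to auto, power on, then run
    clibs.execution('overview.reload', hr, w2t, tab_name, prj_path='target/_build/target.prj', tasks=['brake', 'stop0_related'])
    clibs.execution('rl_task.pp_to_main', hr, w2t, tab_name, tasks=['brake', 'stop0_related'])
    clibs.execution('state.switch_auto', hr, w2t, tab_name)
    clibs.execution('state.switch_motor_on', hr, w2t, tab_name)
    clibs.execution('rl_task.run', hr, w2t, tab_name, tasks=['brake', 'stop0_related'])
```
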

View File

@ -1,25 +1,22 @@
from json import load, dumps, loads from json import load, dumps, loads
from socket import socket, setdefaulttimeout, AF_INET, SOCK_STREAM from socket import socket, AF_INET, SOCK_STREAM
from threading import Thread from threading import Thread
import selectors import selectors
from time import time, sleep from time import time, sleep
from os.path import dirname
from pymodbus.client.tcp import ModbusTcpClient from pymodbus.client.tcp import ModbusTcpClient
from pymodbus.payload import BinaryPayloadDecoder, BinaryPayloadBuilder from pymodbus.payload import BinaryPayloadDecoder, BinaryPayloadBuilder
from pymodbus.constants import Endian from pymodbus.constants import Endian
from logging import getLogger
from commons import clibs
MAX_FRAME_SIZE = 1024 logger = getLogger(__file__)
setdefaulttimeout(2)
current_path = dirname(__file__)
heartbeat = f'{current_path}/../assets/templates/heartbeat'
class ModbusRequest(object): class ModbusRequest(object):
def __init__(self, w2t): def __init__(self, w2t):
super().__init__() super().__init__()
self.w2t = w2t self.w2t = w2t
self.tab_name = 'openapi' self.tab_name = 'openapi'
self.host = '192.168.0.160' self.host = clibs.ip_addr
self.port = 502 self.port = 502
self.interval = 0.3 self.interval = 0.3
self.c = ModbusTcpClient(self.host, self.port) self.c = ModbusTcpClient(self.host, self.port)
@@ -195,21 +192,19 @@ class HmiRequest(object):
def sock_conn(self):
# while True:
-with open(heartbeat, "r", encoding='utf-8') as f_hb:
+with open(clibs.heartbeat, "r", encoding='utf-8') as f_hb:
c_state = f_hb.read().strip()
if c_state == '0':
try:
self.c = socket(AF_INET, SOCK_STREAM)
-self.c.connect(('192.168.0.160', 5050))
-# self.c.connect(('192.168.84.129', 5050))
+self.c.connect((clibs.ip_addr, 5050))
self.c.setblocking(False)
self.c_xs = socket(AF_INET, SOCK_STREAM)
-self.c_xs.connect(('192.168.0.160', 6666))
-# self.c_xs.connect(('192.168.84.129', 6666))
+self.c_xs.connect((clibs.ip_addr, 6666))
self.c_xs.setblocking(False)
self.w2t("Connection success", 0, 0, 'green', tab_name=self.tab_name)
-with open(heartbeat, "w", encoding='utf-8') as f_hb:
+with open(clibs.heartbeat, "w", encoding='utf-8') as f_hb:
f_hb.write('1')
md = ModbusRequest(self.w2t)
md.reset_estop()
@@ -219,7 +214,7 @@ class HmiRequest(object):
md.write_axis(1)
except Exception as Err:
self.w2t("Connection failed...", 0, 0, 'red', tab_name=self.tab_name)
-with open(heartbeat, "w", encoding='utf-8') as f_hb:
+with open(clibs.heartbeat, "w", encoding='utf-8') as f_hb:
f_hb.write('0')
def header_check(self, index, data):
@@ -247,33 +242,29 @@ class HmiRequest(object):
# print(f"in head check data: {data}")
self.broke = 100
-index += MAX_FRAME_SIZE
+index += clibs.MAX_FRAME_SIZE
return index, 0, 0
def heartbeat(self):
while self.t_bool:
_id = self.execution('controller.heart')
_flag = '0' if self.get_from_id(_id) is None else '1'
-print(f"hb = {_flag}", end=' ')
-print(f"len(c_msg) = {len(self.c_msg)}", end=' ')
-print(f"len(c_msg_xs) = {len(self.c_msg_xs)}", end='\n')
+# print(f"hb = {_flag}", end=' ')
+# print(f"len(c_msg) = {len(self.c_msg)}", end=' ')
+# print(f"len(c_msg_xs) = {len(self.c_msg_xs)}", end='\n')
-with open(heartbeat, "w", encoding='utf-8') as f_hb:
+with open(clibs.heartbeat, "w", encoding='utf-8') as f_hb:
f_hb.write(_flag)
if _flag == '0':
self.w2t(f"{_id} 心跳丢失,连接失败,重新连接中...", 0, 7, 'red', tab_name=self.tab_name)
-t = time()
-with open(f"{current_path}/../assets/templates/c_msg.log", "w", encoding='utf-8') as f:
-for msg in self.c_msg:
-f.write(str(t) + '-' + str(loads(msg)) + '\n')
-sleep(1)
+sleep(2)
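The heartbeat thread and the connection code coordinate through the flag file: each cycle heartbeat() sends controller.heart, writes '1' or '0' to clibs.heartbeat depending on whether the response was found, and sleeps 2 s (the new heartbeat period from the changelog); the reconnect path only acts when it reads '0'. A stripped-down sketch of that handshake, with file name and timing taken as assumptions from this diff:

```
from time import sleep

HEARTBEAT_FLAG = 'heartbeat'                        # stand-in for clibs.heartbeat

def heartbeat_loop(send_heart, find_response):
    # send a heartbeat, record whether the reply showed up, repeat every 2 s
    while True:
        req_id = send_heart()                       # e.g. hr.execution('controller.heart')
        alive = find_response(req_id) is not None   # e.g. hr.get_from_id(req_id)
        with open(HEARTBEAT_FLAG, 'w', encoding='utf-8') as f:
            f.write('1' if alive else '0')
        sleep(2)

def reconnect_if_needed(connect):
    # the connection side only re-opens the sockets when the flag says the link is down
    with open(HEARTBEAT_FLAG, 'r', encoding='utf-8') as f:
        if f.read().strip() == '0':
            connect()                               # e.g. HmiRequest.sock_conn()
```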
def msg_storage(self, response, flag=0):
# response是解码后的字符串
messages = self.c_msg if flag == 0 else self.c_msg_xs
+logger.warning(f"{loads(response)}")
if 'move.monitor' in response:
pass
-elif len(messages) < 20000:
+elif len(messages) < 10000:
messages.insert(0, response)
else:
messages.insert(0, response)
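msg_storage() now hands every decoded response to logger.warning(), so persistence is delegated to the logging setup in clibs (per the changelog: rotate at 50 MB, keep at most 10 backups, via concurrent_log_handler). The handler configuration below is a guess at what clibs does, not a copy of it; the log path is assumed from the new .gitignore entries:

```
import logging
from concurrent_log_handler import ConcurrentRotatingFileHandler

# assumed location: assets/templates/c_msg.log plus rotated copies c_msg.log.1 ... c_msg.log.10
handler = ConcurrentRotatingFileHandler(
    'aio/assets/templates/c_msg.log',
    mode='a',
    maxBytes=50 * 1024 * 1024,   # split at 50 MB
    backupCount=10,              # keep at most 10 rotated files
    encoding='utf-8',
)
handler.setFormatter(logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
logging.getLogger().addHandler(handler)
logging.getLogger().setLevel(logging.WARNING)
```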
@@ -317,7 +308,7 @@ class HmiRequest(object):
# print(f"broke == 0 index = {self.index-8}")
# print(f"broke == 0 INIT pkg size = {self.pkg_size}")
# print(f"broke == 0 data = {data}")
-if self.index > MAX_FRAME_SIZE:
+if self.index > clibs.MAX_FRAME_SIZE:
break
# 详见解包原理数据.txt,self.pkg_size 永远是除了当前data之外剩余未处理的数据大小
if self.pkg_size <= len(data) - self.index:
@@ -361,15 +352,15 @@ class HmiRequest(object):
# print('flag = 0 encounter broke == 3')
self.broke = 3
-self.index += MAX_FRAME_SIZE
+self.index += clibs.MAX_FRAME_SIZE
self.reset_index = 1
-break  # 因为 index + 2 的大小超过 MAX_FRAME_SIZE
+break  # 因为 index + 2 的大小超过 clibs.MAX_FRAME_SIZE
elif self.index+_frame_size-6 > len(data):
self.response = data[self.index:].decode()
self.pkg_size -= (len(data) - self.index)  # 详见解包原理数据.txt,self.pkg_size
self.leftover = (_frame_size-6-(len(data)-self.index))
-self.index += MAX_FRAME_SIZE
+self.index += clibs.MAX_FRAME_SIZE
self.reset_index = 1
# print(f"in flag=0 else data = {data}")
@@ -438,9 +429,9 @@ class HmiRequest(object):
# print('flag = 1 encounter broke == 3')
self.broke = 3
-self.index += MAX_FRAME_SIZE
+self.index += clibs.MAX_FRAME_SIZE
self.reset_index = 1
-break  # 因为 index + 2 的大小超过 MAX_FRAME_SIZE
+break  # 因为 index + 2 的大小超过 clibs.MAX_FRAME_SIZE
# print(f"in pkg size > 0 loop after if data = {data}")
# print(f"in pkg size > 0 loop after if index = {self.index}")
# print(f"in pkg size > 0 loop after if pkg size = {self.pkg_size}")
@@ -457,7 +448,7 @@ class HmiRequest(object):
self.response += data[self.index:].decode()
self.leftover -= (len(data) - self.index)
self.pkg_size -= (len(data) - self.index)
-self.index += MAX_FRAME_SIZE
+self.index += clibs.MAX_FRAME_SIZE
self.reset_index = 1
# print(f"in pkg size > 0 loop after else data = {data}")
# print(f"in pkg size > 0 loop after else index = {self.index}")
@@ -511,15 +502,21 @@ class HmiRequest(object):
def get_from_id(self, msg_id, flag=0):
for i in range(3):
-messages = self.c_msg if flag == 0 else self.c_msg_xs
-for msg in messages:
-if msg_id is None:
-return None
-elif msg_id in msg:
-return msg
+with open(clibs.log_data, mode='r', encoding='utf-8') as f_log:
+for line in f_log:
+if msg_id in line.strip():
+return line
sleep(1)
-else:
-return None
+else:  # 尝试在上一次分割的日志中查找,只做一次
+sleep(1)
+try:
+with open(clibs.log_data+'.1', mode='r', encoding='utf-8') as f_log:
+for line in f_log:
+if msg_id in line.strip():
+return line
+except FileNotFoundError:
+pass
+return None
def package(self, cmd):
_frame_head = (len(cmd) + 6).to_bytes(length=2, byteorder='big')
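package() prepends a 2-byte big-endian length field that counts the command plus a 6-byte frame overhead, and unpackage()/get_response() walk the received buffer in clibs.MAX_FRAME_SIZE chunks using that field. A minimal round-trip of just the length prefix (the layout of the remaining header bytes is not shown in this diff, so it is left out here):

```
# length-prefix handling only -- not the full frame layout used by the HMI protocol
def frame_head(cmd: bytes) -> bytes:
    # total frame length = command length + 6 bytes of header/overhead
    return (len(cmd) + 6).to_bytes(length=2, byteorder='big')

def frame_length(buf: bytes, index: int) -> int:
    # read the same 2-byte field back when scanning a received buffer
    return int.from_bytes(buf[index:index + 2], byteorder='big')

cmd = b'{"jsonrpc": "2.0"}'
head = frame_head(cmd)
assert frame_length(head + cmd, 0) == len(cmd) + 6
```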
@@ -533,7 +530,7 @@ class HmiRequest(object):
def unpackage(self, sock):
def to_read(conn, mask):
-data = conn.recv(MAX_FRAME_SIZE)
+data = conn.recv(clibs.MAX_FRAME_SIZE)
if data:
# print(data)
self.get_response(data)
@@ -580,7 +577,7 @@ class HmiRequest(object):
if flg == 0:  # for old protocols
req = None
try:
-with open(f'{current_path}/../assets/templates/json/{command}.json', encoding='utf-8',
+with open(f'{clibs.PREFIX}templates/json/{command}.json', encoding='utf-8',
mode='r') as f_json:
req = load(f_json)
except:
@@ -1 +0,0 @@
-__all__ = ['brake', 'current', 'iso', 'wavelogger']
@@ -1,49 +1,14 @@
# coding: utf-8
-from os import scandir
-from os.path import isfile, exists
+from os.path import isfile
from sys import argv
from openpyxl import load_workbook
from time import time, sleep, strftime, localtime
from threading import Thread
from pandas import read_csv
+from logging import getLogger
+from commons import clibs
+logger = getLogger(__file__)
-class GetThreadResult(Thread):
-def __init__(self, func, args=()):
-super(GetThreadResult, self).__init__()
-self.func = func
-self.args = args
-self.result = 0
-def run(self):
-sleep(1)
-self.result = self.func(*self.args)
-def get_result(self):
-Thread.join(self)  # 等待线程执行完毕
-try:
-return self.result
-except Exception as Err:
-return None
-def traversal_files(path, w2t):
-# 功能:以列表的形式分别返回指定路径下的文件和文件夹,不包含子目录
-# 参数:路径
-# 返回值:路径下的文件夹列表 路径下的文件列表
-if not exists(path):
-msg = f'数据文件夹{path}不存在,请确认后重试......'
-w2t(msg, 0, 1, 'red')
-else:
-dirs = []
-files = []
-for item in scandir(path):
-if item.is_dir():
-dirs.append(item.path)
-elif item.is_file():
-files.append(item.path)
-return dirs, files
def check_files(path, raw_data_dirs, result_files, w2t):
@@ -83,7 +48,7 @@ def check_files(path, raw_data_dirs, result_files, w2t):
规则解释:AA/BB/CC 指的是臂展/负载/速度的比例,例如:reach66_load100_speed33,66%臂展,100%负载以及33%速度情况下的测试结果文件夹"""
w2t(msg, 0, 4, 'red')
-_, raw_data_files = traversal_files(raw_data_dir, w2t)
+_, raw_data_files = clibs.traversal_files(raw_data_dir, w2t)
if len(raw_data_files) != 3:
msg = f"数据目录 {raw_data_dir} 下数据文件个数错误,每个数据目录下有且只能有三个以 .data 为后缀的数据文件"
w2t(msg, 0, 5, 'red')
@@ -109,6 +74,7 @@ def get_configs(configfile, w2t):
return av, rr
def now_doing_msg(docs, flag, w2t):
# 功能:输出正在处理的文件或目录
# 参数:文件或目录,start 或 done 标识
@@ -228,7 +194,7 @@ def data_process(result_file, raw_data_dirs, av, rr, vel, trq, estop, w2t):
global stop
stop = 0
-t_excel = GetThreadResult(load_workbook, args=(result_file, ))
+t_excel = clibs.GetThreadResult(load_workbook, args=(result_file, ))
t_wait = Thread(target=w2t_local, args=('.', 1, w2t))
t_excel.start()
t_wait.start()
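clibs.GetThreadResult presumably keeps the interface of the class deleted at the top of this file: run a callable on a worker thread, then get_result() joins the thread and hands back the stored result, which lets the UI print progress dots while a large workbook loads. A usage sketch under that assumption:

```
from openpyxl import load_workbook
from commons import clibs

# result_file: path to the results .xlsx being opened in the background
t_excel = clibs.GetThreadResult(load_workbook, args=(result_file, ))
t_excel.start()
wb = t_excel.get_result()   # joins the worker thread and returns load_workbook's result
```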
@@ -242,7 +208,7 @@ def data_process(result_file, raw_data_dirs, av, rr, vel, trq, estop, w2t):
for raw_data_dir in raw_data_dirs:
if raw_data_dir.split('\\')[-1].split('_')[0] == prefix:
now_doing_msg(raw_data_dir, 'start', w2t)
-_, data_files = traversal_files(raw_data_dir, w2t)
+_, data_files = clibs.traversal_files(raw_data_dir, w2t)
# 数据文件串行处理模式---------------------------------
# count = 1
# for data_file in data_files:
@@ -280,7 +246,7 @@ def main(path, vel, trq, estop, w2t):
# 参数:initialization函数的返回值
# 返回值:-
time_start = time()
-raw_data_dirs, result_files = traversal_files(path, w2t)
+raw_data_dirs, result_files = clibs.traversal_files(path, w2t)
try:
# threads = []
@@ -1,31 +1,14 @@
from openpyxl import load_workbook
-from os import scandir
-from os.path import exists
from sys import argv
from pandas import read_csv, concat, set_option
from re import match
from threading import Thread
from time import sleep
from csv import reader, writer
+from logging import getLogger
+from commons import clibs
+logger = getLogger(__file__)
-class GetThreadResult(Thread):
-def __init__(self, func, args=()):
-super(GetThreadResult, self).__init__()
-self.func = func
-self.args = args
-self.result = 0
-def run(self):
-sleep(1)
-self.result = self.func(*self.args)
-def get_result(self):
-Thread.join(self)  # 等待线程执行完毕
-try:
-return self.result
-except Exception as Err:
-return None
def w2t_local(msg, wait, w2t):
@@ -38,27 +21,8 @@ def w2t_local(msg, wait, w2t):
break
-def traversal_files(path, w2t):
-# 功能:以列表的形式分别返回指定路径下的文件和文件夹,不包含子目录
-# 参数:路径
-# 返回值:路径下的文件夹列表 路径下的文件列表
-if not exists(path):
-msg = f'数据文件夹{path}不存在,请确认后重试......'
-w2t(msg, 0, 8, 'red')
-else:
-dirs = []
-files = []
-for item in scandir(path):
-if item.is_dir():
-dirs.append(item.path)
-elif item.is_file():
-files.append(item.path)
-return dirs, files
def initialization(path, sub, w2t):
-_, data_files = traversal_files(path, w2t)
+_, data_files = clibs.traversal_files(path, w2t)
count = 0
for data_file in data_files:
@@ -69,8 +33,8 @@ def initialization(path, sub, w2t):
count += 1
else:
if not (match('j[1-7].*\\.data', filename) or match('j[1-7].*\\.csv', filename)):
-print(f"不合规 {data_file}")
-msg = f"所有文件必须以 jx_ 开头,以 .data/csv 结尾,x取值1-7,请检查后重新运行。"
+msg = f"不合规 {data_file}\n"
+msg += f"所有文件必须以 jx_ 开头,以 .data/csv 结尾,x取值1-7,请检查后重新运行。"
w2t(msg, 0, 6, 'red')
if not ((sub == 'cycle' and count == 2) or (sub != 'cycle' and count == 1)):
@@ -174,7 +138,7 @@ def current_cycle(dur, data_files, rcs, rrs, vel, trq, trqh, rpms, w2t):
w2t(f"正在打开文件 {result},需要 10s 左右", 1, 0, 'orange')
global stop
stop = 0
-t_excel = GetThreadResult(load_workbook, args=(result, ))
+t_excel = clibs.GetThreadResult(load_workbook, args=(result, ))
t_wait = Thread(target=w2t_local, args=('.', 1, w2t))
t_excel.start()
t_wait.start()
@@ -268,7 +232,7 @@ def p_single(wb, single, vel, trq, rpms, rrs, w2t):
col_names = list(df.columns)
df_1 = df[col_names[vel-1]].multiply(rpm*addition)
df_2 = df[col_names[trq-1]].multiply(scale)
-print(df_1.abs().max())
+# print(df_1.abs().max())
df = concat([df_1, df_2], axis=1)
_step = 5 if data_file.endswith('.csv') else 50
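For context on the p_single() hunk above: one column is treated as motor speed and scaled by rpm*addition, another as torque and scaled by scale, and the two are recombined before being sampled every _step rows (5 for .csv sources, 50 for .data). A self-contained sketch with made-up column positions and factors; the real indices arrive through the vel/trq parameters and the stepping in the actual code may differ:

```
import pandas as pd

df = pd.read_csv('j1_example.csv')          # hypothetical input file
vel_col, trq_col = 0, 1                     # illustrative column positions
rpm, addition, scale = 3000, 1.0, 0.001     # illustrative factors only

speed = df.iloc[:, vel_col].multiply(rpm * addition)
torque = df.iloc[:, trq_col].multiply(scale)
merged = pd.concat([speed, torque], axis=1)
sampled = merged.iloc[::5]                  # one plausible way to apply the 5/50 step
```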
@@ -1,27 +1,12 @@
# _*_ encoding:utf-8 _*_
import pdfplumber
from openpyxl import load_workbook
-from os import scandir, remove
-from os.path import exists
+from os import remove
from sys import argv
+from logging import getLogger
+from commons import clibs
+logger = getLogger(__file__)
-def traversal_files(path, w2t):
-# 功能:以列表的形式分别返回指定路径下的文件和文件夹,不包含子目录
-# 参数:路径
-# 返回值:路径下的文件夹列表 路径下的文件列表
-if not exists(path):
-msg = f'数据文件夹{path}不存在,请确认后重试......'
-w2t(msg, 0, 1, 'red')
-else:
-dirs = files = []
-for item in scandir(path):
-if item.is_dir():
-dirs.append(item.path)
-elif item.is_file():
-files.append(item.path)
-return dirs, files
def p_iso(file, p_files, ws, tmpfile):
@@ -153,7 +138,7 @@ def p_iso_1000(file, p_files, ws, tmpfile):
def main(path, w2t):
-dirs, files = traversal_files(path, 1)
+dirs, files = clibs.traversal_files(path, 1)
try:
wb = load_workbook(path + "/iso-results.xlsx")
@@ -1,31 +1,11 @@
-import os
-import random
from pandas import read_csv
from csv import reader
from sys import argv
-from os.path import exists
-from os import scandir, remove
from openpyxl import Workbook
-from random import randint
+from logging import getLogger
+from commons import clibs
+logger = getLogger(__file__)
-def traversal_files(path, w2t):
-# 功能:以列表的形式分别返回指定路径下的文件和文件夹,不包含子目录
-# 参数:路径
-# 返回值:路径下的文件夹列表 路径下的文件列表
-if not exists(path):
-msg = f'数据文件夹{path}不存在,请确认后重试......'
-w2t(msg, 0, 1, 'red')
-else:
-dirs = []
-files = []
-for item in scandir(path):
-if item.is_dir():
-dirs.append(item.path)
-elif item.is_file():
-files.append(item.path)
-return dirs, files
def find_point(bof, step, pos, data_file, flag, df, row, w2t):
@@ -95,7 +75,7 @@ def get_cycle_info(data_file, df, row, step, w2t):
def initialization(path, w2t):
-_, data_files = traversal_files(path, w2t)
+_, data_files = clibs.traversal_files(path, w2t)
for data_file in data_files:
if not data_file.lower().endswith('.csv'):
@@ -126,7 +106,7 @@ def single_file_proc(ws, data_file, df, low, high, cycle, w2t):
_step = 5
_data = {}
row_max = df.index[-1]-100
-print(data_file)
+# print(data_file)
while _row < row_max:
if count not in _data.keys():
_data[count] = []
@@ -149,7 +129,7 @@ def single_file_proc(ws, data_file, df, low, high, cycle, w2t):
ws.cell(row=1, column=i).value = f"{i-1}次测试"
ws.cell(row=i, column=1).value = f"{i-1}次精度变化"
-print(_data)
+# print(_data)
for i in sorted(_data.keys()):
_row = 2
_column = i + 1
@@ -162,9 +142,9 @@ def execution(data_files, w2t):
wb = Workbook()
for data_file in data_files:
ws, df, low, high, cycle = preparation(data_file, wb, w2t)
-print(f"low = {low}")
-print(f"high = {high}")
-print(f"cycle = {cycle}")
+# print(f"low = {low}")
+# print(f"high = {high}")
+# print(f"cycle = {cycle}")
single_file_proc(ws, data_file, df, low, high, cycle, w2t)
wd = data_files[0].split('\\')
@@ -1 +0,0 @@
-__all__ = ['factory_test']
@@ -1,7 +1,4 @@
from sys import argv
-from os.path import exists, dirname
-from os import scandir
-from paramiko import SSHClient, AutoAddPolicy
from json import loads
from time import sleep, time, strftime, localtime
from pandas import DataFrame
@@ -9,11 +6,12 @@ from openpyxl import load_workbook
from math import sqrt
from numpy import power
from csv import writer
+from logging import getLogger
+from commons import clibs
-tab_name = 'Durable Action'
+logger = getLogger(__file__)
+tab_name = clibs.tab_names['da']
count = 0
-durable_data_current_xlsx = f'{dirname(__file__)}/../../assets/templates/durable/durable_data_current.xlsx'
-durable_data_current_max_xlsx = f'{dirname(__file__)}/../../assets/templates/durable/durable_data_current_max.xlsx'
display_pdo_params = [
# {"name": "hw_joint_vel_feedback", "channel": 0},
# {"name": "hw_joint_vel_feedback", "channel": 1},
@@ -34,22 +32,6 @@ title = [
]
-def traversal_files(path, w2t):
-if not exists(path):
-msg = f'数据文件夹{path}不存在,请确认后重试......'
-w2t(msg, 0, 1, 'red', tab_name=tab_name)
-else:
-dirs = []
-files = []
-for item in scandir(path):
-if item.is_dir():
-dirs.append(item.path)
-elif item.is_file():
-files.append(item.path)
-return dirs, files
def check_files(data_dirs, data_files, w2t):
if len(data_dirs) != 0 or len(data_files) != 2:
w2t('初始路径下不允许有文件夹,且初始路径下只能存在如下文件,确认后重新运行!\n1. target.zip\n2. configs.xlsx', 0, 10, 'red', tab_name)
@@ -63,50 +45,10 @@ def check_files(data_dirs, data_files, w2t):
return data_files
-def prj_to_xcore(prj_file):
-ssh = SSHClient()
-ssh.set_missing_host_key_policy(AutoAddPolicy())
-ssh.connect('192.168.0.160', 22, username='luoshi', password='luoshi2019')
-sftp = ssh.open_sftp()
-sftp.put(prj_file, '/tmp/target.zip')
-cmd = 'cd /tmp; rm -rf target/; mkdir target; unzip -d target/ -q target.zip; '
-cmd += 'chmod 777 -R target/; rm target.zip'
-ssh.exec_command(cmd)
-cmd = 'sudo rm -rf /home/luoshi/bin/controller/projects/target; '
-cmd += 'sudo mv /tmp/target/ /home/luoshi/bin/controller/projects/'
-stdin, stdout, stderr = ssh.exec_command(cmd, get_pty=True)
-stdin.write('luoshi2019' + '\n')
-stdin.flush()
-print(stdout.read().decode())  # 必须得输出一下stdout,才能正确执行sudo
-print(stderr.read().decode())  # 顺便也执行以下stderr
-cmd = 'cd /home/luoshi/bin/controller/; '
-cmd += 'sudo mv projects/target/_build/*.prj projects/target/_build/target.prj'
-stdin, stdout, stderr = ssh.exec_command(cmd, get_pty=True)
-stdin.write('luoshi2019' + '\n')
-stdin.flush()
-print(stdout.read().decode())  # 必须得输出一下stdout,才能正确执行sudo
-print(stderr.read().decode())  # 顺便也执行以下stderr
-ssh.close()
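prj_to_xcore() moves into clibs with, per the changelog, a rewritten project-naming step that fixes deployments where more than one .prj is produced; the exact new logic is not shown in this diff. What carries over unchanged is the paramiko pattern the deleted code relied on: running sudo commands through exec_command(get_pty=True), feeding the password on stdin, and reading stdout so the command actually completes. A minimal sketch of just that pattern:

```
from paramiko import SSHClient, AutoAddPolicy

def run_sudo(host, user, password, command):
    # minimal sketch of the sudo-over-SSH pattern used by the removed prj_to_xcore()
    ssh = SSHClient()
    ssh.set_missing_host_key_policy(AutoAddPolicy())
    ssh.connect(host, 22, username=user, password=password)
    stdin, stdout, stderr = ssh.exec_command(command, get_pty=True)
    stdin.write(password + '\n')   # sudo prompts for the password on the pty
    stdin.flush()
    out = stdout.read().decode()   # reading stdout drives the command to completion
    err = stderr.read().decode()
    ssh.close()
    return out, err

# e.g. run_sudo(clibs.ip_addr, 'luoshi', '...', 'sudo mv /tmp/target/ /home/luoshi/bin/controller/projects/')
```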
-def execution(cmd, hr, w2t, **kwargs):
-_id = hr.execution(cmd, **kwargs)
-_msg = hr.get_from_id(_id)
-if not _msg:
-w2t(f"无法获取{_id}请求的响应信息", 0, 7, 'red', tab_name=tab_name)
-else:
-_response = loads(_msg)
-if not _response:
-w2t(f"无法获取{id}请求的响应信息", 0, 1, 'red', tab_name=tab_name)
-return _response
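The local execution() helper above is replaced by clibs.execution(), which, judging from the call sites in run_rl() below, takes the target tab name as an extra positional argument after w2t. The sketch below only mirrors the deleted helper with that extra parameter; the real implementation lives in commons/clibs.py and may differ:

```
from json import loads

def execution(cmd, hr, w2t, tab_name, **kwargs):
    # assumed shape of clibs.execution, inferred from the calls in run_rl()
    _id = hr.execution(cmd, **kwargs)   # send the openapi command
    _msg = hr.get_from_id(_id)          # look the response up (now via the rotated log)
    if not _msg:
        w2t(f"无法获取{_id}请求的响应信息", 0, 7, 'red', tab_name=tab_name)
        return None
    return loads(_msg)
```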
def run_rl(path, config_file, data_all, hr, md, w2t):
# 1. 关闭诊断曲线,触发软急停,并解除,目的是让可能正在运行着的机器停下来,切手动模式并下电
-_response = execution('diagnosis.open', hr, w2t, open=True, display_open=True)
-_response = execution('diagnosis.set_params', hr, w2t, display_pdo_params=display_pdo_params)
+clibs.execution('diagnosis.open', hr, w2t, tab_name, open=True, display_open=True)
+clibs.execution('diagnosis.set_params', hr, w2t, tab_name, display_pdo_params=display_pdo_params)
md.trigger_estop()
md.reset_estop()
md.write_act(False)
@@ -114,13 +56,13 @@ def run_rl(path, config_file, data_all, hr, md, w2t):
# 2. reload工程后,pp2main,并且自动模式和上电
prj_path = 'target/_build/target.prj'
-_response = execution('overview.reload', hr, w2t, prj_path=prj_path, tasks=['current'])
-_response = execution('rl_task.pp_to_main', hr, w2t, tasks=['current'])
-_response = execution('state.switch_auto', hr, w2t)
-_response = execution('state.switch_motor_on', hr, w2t)
+clibs.execution('overview.reload', hr, w2t, tab_name, prj_path=prj_path, tasks=['current'])
+clibs.execution('rl_task.pp_to_main', hr, w2t, tab_name, tasks=['current'])
+clibs.execution('state.switch_auto', hr, w2t, tab_name)
+clibs.execution('state.switch_motor_on', hr, w2t, tab_name)
# 3. 开始运行程序
-_response = execution('rl_task.run', hr, w2t, tasks=['current'])
+clibs.execution('rl_task.run', hr, w2t, tab_name, tasks=['current'])
_t_start = time()
while True:
if md.read_ready_to_go() == 1:
@@ -178,8 +120,8 @@ def get_durable_data(path, data, scenario_time, wait_time, rcs, hr, md, w2t):
_data_list.insert(0, loads(_msg))
else:
hr.c_msg_xs.clear()
-if len(hr.c_msg) > 240:
-del hr.c_msg[240:]
+if len(hr.c_msg) > 270:
+del hr.c_msg[270:]
# with open(f'{path}\\log.txt', 'w', encoding='utf-8') as f_obj:
# for _ in _data_list:
@@ -253,8 +195,8 @@ def get_durable_data(path, data, scenario_time, wait_time, rcs, hr, md, w2t):
while True:
if not hr.durable_lock:
hr.durable_lock = 1
-_df_1.to_excel(durable_data_current_xlsx, index=False)
-_df_2.to_excel(durable_data_current_max_xlsx, index=False)
+_df_1.to_excel(clibs.durable_data_current_xlsx, index=False)
+_df_2.to_excel(clibs.durable_data_current_max_xlsx, index=False)
hr.durable_lock = 0
break
else:
@@ -272,28 +214,10 @@ def get_durable_data(path, data, scenario_time, wait_time, rcs, hr, md, w2t):
def main(path, hr, md, w2t):
-durable_data_current = {
-'time': list(range(1, 19)),
-'axis1': [0 for _ in range(18)],
-'axis2': [0 for _ in range(18)],
-'axis3': [0 for _ in range(18)],
-'axis4': [0 for _ in range(18)],
-'axis5': [0 for _ in range(18)],
-'axis6': [0 for _ in range(18)],
-}
-durable_data_current_max = {
-'time': list(range(1, 19)),
-'axis1': [0 for _ in range(18)],
-'axis2': [0 for _ in range(18)],
-'axis3': [0 for _ in range(18)],
-'axis4': [0 for _ in range(18)],
-'axis5': [0 for _ in range(18)],
-'axis6': [0 for _ in range(18)],
-}
-data_all = [durable_data_current, durable_data_current_max]
-data_dirs, data_files = traversal_files(path, w2t)
+data_all = [clibs.durable_data_current, clibs.durable_data_current_max]
+data_dirs, data_files = clibs.traversal_files(path, w2t)
config_file, prj_file = check_files(data_dirs, data_files, w2t)
-prj_to_xcore(prj_file)
+clibs.prj_to_xcore(prj_file)
run_rl(path, config_file, data_all, hr, md, w2t)