19 Commits

SHA1 Message Date
ea56dfce52 last commit 2025-02-25 10:51:22 +08:00
88776c6794 fix: handle abnormal motor-current cases 2024-12-30 09:04:24 +08:00
14f6d43027 1. [main: do_brake.py] Replaced the hard-coded SSH IP with the value read from clibs, and removed the project reload that used to run on every pass; the project is now reloaded only once, when the RL project is modified. This is meant to reduce the recently frequent failures to get a response to the "overview.reload-xxxxxx" request; the preliminary conclusion is that this is an xCore issue rather than an AIO issue, and it has been reported and awaits a fix in a later release. (See the SSH sketch after this entry.)
2. [main: wavelogger.py] Added validation of abnormal data
2024-12-16 13:51:04 +08:00
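A minimal sketch of what this change amounts to, assuming paramiko's SSHClient (which the do_brake.py hunk further down also uses) and the credentials shown there; the import path for clibs is an assumption, and the only point illustrated is that the address now comes from clibs instead of being hard-coded:

from paramiko import SSHClient, AutoAddPolicy
from commons import clibs  # assumed import path; clibs exposes ip_addr read from assets/templates/ipaddr.txt

ssh = SSHClient()
ssh.set_missing_host_key_policy(AutoAddPolicy())
# connect to whichever controller address is currently configured, rather than a fixed 192.168.0.160
ssh.connect(hostname=clibs.ip_addr, port=22, username='luoshi', password='luoshi2019')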
4b6f78dd7e update source files 2024-12-05 16:24:54 +08:00
4d297118e0 1. [current: do_current.py] Added collection of the hw_sensor_trq_feedback curve
2. [current: current.py] Added processing of hw_sensor_trq_feedback curve data and revised the earlier data-processing logic
3. [current: clibs.py] Added support for manually setting the connection IP address, stored in assets/templates/ipaddr.txt; the default is 192.168.0.160 (see the sketch after this entry)
2024-12-05 16:14:59 +08:00
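A minimal sketch of the ipaddr.txt fallback logic, mirroring the clibs.py hunk further down; the literal path is spelled out here only for illustration (clibs builds it from its PREFIX constant), and OSError is caught where the actual hunk uses a bare except:

try:
    # assets/templates/ipaddr.txt holds a single line with the controller address
    with open('../assets/templates/ipaddr.txt', mode='r', encoding='utf-8') as f_ipaddr:
        ip_addr = f_ipaddr.read().strip()
except OSError:
    # fall back to the documented default when the file is missing or unreadable
    ip_addr = '192.168.0.160'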
5c5168442f update configs.xlsx 2024-10-25 14:21:21 +08:00
c9fa3a4473 modify current max scenario time from 150s to 250s 2024-10-12 21:49:16 +08:00
880964f675 Adjusted the brake-completion delay time again 2024-10-12 16:09:58 +08:00
3481d3b496 Changed the pending time after each brake completion to 3 s 2024-10-10 09:19:43 +08:00
bebaf292ac Fixed the brake performance test issue where the collected speed data had not dropped to 0 2024-10-09 16:00:47 +08:00
9f78b0e563 minor modification of UI for current data process 2024-09-20 14:13:54 +08:00
59711d9c65 4. [main: openapi.py]: Added support for the rl_task.set_run_params command, which sets the speed slider and whether the program runs in a loop (see the request sketch after this entry)
5. [main: do_brake/do_current/factory_test.py]: Added `clibs.execution('rl_task.set_run_params', hr, w2t, tab_name, loop_mode=True, override=1.0)` when initializing motion
2024-08-20 18:03:44 +08:00
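Judging from the request template and the openapi.py match-case added in this change set, the body sent for this command presumably looks like the following sketch (the id is a per-request placeholder filled in by gen_id):

req = {
    "id": "xxxxxxxxxxx",          # placeholder; the real value is generated per request
    "module": "project",
    "command": "rl_task.set_run_params",
    "data": {
        "loop_mode": True,        # keep the RL program running in a loop
        "override": 1.0,          # speed slider at 100%
    },
}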
edafd91567 v0.2.0.8(2024/08/20)
1. [t_change_ui: clibs.py]
   - Copy icon.ico from outside the project into the templates directory
   - Create a logs directory under assets for log files, with the logic needed to keep everything working
2. [t_change_ui: aio.py]: Added the App window icon code
3. [t_change_ui: openapi.py]: Moved the repeatedly printed network-error messages from the textbox into the debug.log file
2024-08-20 11:13:45 +08:00
03b15751c2 add exception handle for openapi-selector 2024-08-17 09:28:33 +08:00
29bd4185c4 version change 2024-08-16 17:48:08 +08:00
97071d231f Merge branch 't_change_ui' 2024-08-16 17:23:25 +08:00
62e5e6ab50 Added a reminder to the README that the speed slider must be adjusted before whole-machine automated testing, and refined the current data pre-processing logic 2024-08-16 16:05:31 +08:00
8f342832b2 fix merge conflicts while merging from main 2024-08-16 15:57:00 +08:00
addb123a8a [current: current.py]: In the find_point function, when a valid point cannot be found, continue execution instead of aborting 2024-07-13 13:32:57 +08:00
20 changed files with 211 additions and 88 deletions

.gitignore vendored
View File

@@ -7,10 +7,8 @@ aio/venv
 aio/__pycache__/
 aio/code/automatic_test/__pycache__/
 aio/code/data_process/__pycache__/
-aio/assets/templates/c_msg.log*
 aio/code/durable_action/__pycache__/
 aio/assets/templates/durable/
-aio/assets/templates/.__c_msg.lock
 aio/code/commons/__pycache__/
-aio/assets/templates/debug.log
+aio/assets/templates/logs/
 dial_gauge/results.xlsx

View File

@@ -1,3 +0,0 @@
-# rokae
-测试内容自动化处理

View File

@@ -34,7 +34,7 @@
 打包时,只需要修改 clibs.py 中的 PREFIX 即可,调试时再修改回来
 ```
-pyinstaller --noconfirm --onedir --windowed --optimize 2 --contents-directory . --upx-dir "D:/Syncthing/common/A_Program/upx-4.2.4-win64/" --add-data "C:/Users/Administrator/AppData/Local/Programs/Python/Python312/Lib/site-packages/customtkinter;customtkinter/" --add-data "D:/Syncthing/company/D-测试工作/X-自动化测试/01-AIO/rokae/aio/assets/templates:templates" --version-file ../assets/file_version_info.txt -i ../assets/icon.ico ../code/aio.py -p ../code/data_process/brake.py -p ../code/data_process/iso.py -p ../code/data_process/current.py -p ../code/data_process/wavelogger.py -p ../code/commons/openapi.py -p ../code/commons/clibs.py -p ../code/automatic_test/btn_functions.py -p ../code/automatic_test/do_current.py -p ../code/automatic_test/do_brake.py -p ../code/durable_action/factory_test.py
+pyinstaller --noconfirm --onedir --windowed --optimize 2 --contents-directory . --upx-dir "D:/Syncthing/common/A_Program/upx-4.2.4-win64/" --add-data "C:/Users/Administrator/AppData/Local/Programs/Python/Python312/Lib/site-packages/customtkinter;customtkinter/" --add-data "D:\Syncthing\company\D-测试工作\X-自动化测试\01-Gitea\aio\aio\assets\templates:templates" --version-file ../assets/file_version_info.txt -i ../assets/templates/icon.ico ../code/aio.py -p ../code/data_process/brake.py -p ../code/data_process/iso.py -p ../code/data_process/current.py -p ../code/data_process/wavelogger.py -p ../code/commons/openapi.py -p ../code/commons/clibs.py -p ../code/automatic_test/btn_functions.py -p ../code/automatic_test/do_current.py -p ../code/automatic_test/do_brake.py -p ../code/durable_action/factory_test.py
 ```
 ---
@@ -146,6 +146,7 @@ pyinstaller --noconfirm --onedir --windowed --optimize 2 --contents-directory .
 10. 由于xCore系统问题运行过程中可能会出现机器人宕机问题如果遇到可以手动重启控制柜重新运行
 11. 务必正确填写configs.xlsx中的Target页面A1单元格可以选择正负方向急停
 12. 工程文件可以手动重命名,按照机型存档,或者导出用于自动化测试
+13. 可废弃但未验证自动化测试前需要将HMI程序速度设置为100%并同步至控制器,也即左下方速度滑条滑动至最大
 #### 6) 电机电流自动化测试
@@ -274,7 +275,7 @@ v0.1.5.1(2024/06/12)
 5. [requirements.txt] 新增必要库配置文件
 v0.1.5.2(2024/06/13)
-1. [brake.py/aio.py]: 将sto修改为estop
+1. [brake.py/aio.py] 将sto修改为estop
 2. [brake.py] 修改了速度计算逻辑新版本的vel列数据遵循如下规则av = vel * 180 / pi根据av再计算speed
 3. [brake.py] 将threshold修改为常量50
 4. [brake.py] 提高了输出提示语的明确性,删除了不必要的省略号
@@ -448,18 +449,18 @@ v0.1.7.6(2024/07/04)
 3. [APIs: openapi.py]
 - 增加了modbus读取浮点数的功能
 - 优化了get_from_id的逻辑
-4. [autotest.xml]: 新增了scenario_time只写寄存器
+4. [autotest.xml] 新增了scenario_time只写寄存器
 v0.1.8.0(2024/07/04)
-1. [APIs: do_current.py]: 完成了堵转电流和惯量负载电机电流的采集和处理,至此,电机电流的自动化工作基本完成
+1. [APIs: do_current.py] 完成了堵转电流和惯量负载电机电流的采集和处理,至此,电机电流的自动化工作基本完成
 v0.1.8.1(2024/07/05)
-1. [APIs: do_brake.py]: 完成了制动性能测试框架的搭建,可以顺利执行完整的测试程序,但是未实现急停和数据处理
+1. [APIs: do_brake.py] 完成了制动性能测试框架的搭建,可以顺利执行完整的测试程序,但是未实现急停和数据处理
-2. [APIs: aio.py]: 修改了do_brake主函数的参数
+2. [APIs: aio.py] 修改了do_brake主函数的参数
 3. 增加工程文件target.zip
 v0.1.8.2(2024/07/08)
-1. [APIs: do_brake.py]: 完成了制动性能测试逻辑只不过制动信号传递生效延迟不可控暂时pending
+1. [APIs: do_brake.py] 完成了制动性能测试逻辑只不过制动信号传递生效延迟不可控暂时pending
 2. [APIs: do_current.py]: 修改曲线数据时序主要是value data取反即可解决了波形锯齿明细的问题
 3. [APIs: openapi.py]: modbus新增了触发急停信号的寄存器 stop0_signal并重写了解除急停socket新增了register.set_value协议
@@ -590,8 +591,8 @@ v0.2.0.5(2024/07/31)
 - 保持电流,只取最后 15s
 - 优化 ssh 输入密码的部分
 6. [t_change_ui: all the part]: 引入 commons 包,并定制了 logging 输出,后续持续优化
-7. [APIs: btn_functions.py]: 重写了告警输出函数,从日志中拿数据
+7. [APIs: btn_functions.py] 重写了告警输出函数,从日志中拿数据
-8. [APIs: aio.py]: 将日志框输出的内容,也保存至日志文件
+8. [APIs: aio.py] 将日志框输出的内容,也保存至日志文件
 9. [APIs: do_brake.py]
 - 修改获取初始速度的逻辑只获取configs文件中配置的时间内的速度
 - 新增 configs 参数 single_brake可针对特定条件做测试
@@ -616,4 +617,27 @@ v0.2.0.7(2024/08/16)
 5. [t_change_ui: layout.xlsx]:修改了组件布局方式
 > 前两个修改点,修复的是网络提示的颜色不正确问题,因为日志将 textbox 中的内容也作为 debug 信息写入 hmi.log 了
+v0.2.0.8(2024/08/20)
+1. [t_change_ui: clibs.py]
+- 从外部拷贝 icon.ico 文件到 templates 目录
+- 在 assets 目录新建 logs 目录,存放日志文件,并增加了相应的逻辑保证正常执行
+2. [t_change_ui: aio.py]:增加 App 窗口图标代码
+3. [t_change_ui: openapi.py]:将重复输出的网络错误提示,从 textbox 中转移到 debug.log 日志文件中
+4. [main: openapi.py]:新增 rl_task.set_run_params 指令支持,可设定速度滑块以及是否重复运行
+5. [main: do_brake/do_current/factory_test.py]:在初始化运动时增加 `clibs.execution('rl_task.set_run_params', hr, w2t, tab_name, loop_mode=True, override=1.0)`
+v0.2.0.9(2024/10/09)
+1. [main: do_brake.py] 采集完成后pending 3s使速度完全将为 0
+v0.2.1.0(2024/12/05)
+1. [current: do_current.py] 增加了 hw_sensor_trq_feedback 曲线的采集
+2. [current: current.py] 增加了 hw_sensor_trq_feedback 曲线数据的处理,以及修改了之前数据处理的逻辑
+3. [current: clibs.py] 新增可手动修改连接 IP 地址的功能,存储在 assets/templates/ipaddr.txt 中,默认是 192.168.0.160
+v0.2.1.1(2024/12/16)
+1. [main: do_brake.py] 修改了 SSH 的固定 IP 为 clibs 中读取的内容,并删除了每次都 reload 工程的动作,改为只在修改 RL 工程时 reload 一次旨在减少最近频繁出现的“无法获取overview.reload-xxxxxx”请求的响应初步判断是 xCore 的问题,非 AIO 问题,已反馈待版本修复
+2. [main: wavelogger.py] 新增异常数据校验功能

Binary file not shown.

View File

@@ -6,8 +6,8 @@ VSVersionInfo(
 ffi=FixedFileInfo(
 # filevers and prodvers should be always a tuple with four items: (1, 2, 3, 4)
 # Set not needed items to zero 0.
-filevers=(0, 2, 0, 5),
+filevers=(0, 2, 1, 1),
-prodvers=(0, 2, 0, 5),
+prodvers=(0, 2, 1, 1),
 # Contains a bitmask that specifies the valid bits 'flags'r
 mask=0x3f,
 # Contains a bitmask that specifies the Boolean attributes of the file.
@@ -31,12 +31,12 @@ VSVersionInfo(
 '040904b0',
 [StringStruct('CompanyName', 'Rokae - https://www.rokae.com/'),
 StringStruct('FileDescription', 'All in one automatic toolbox'),
-StringStruct('FileVersion', '0.2.0.5 (2024-08-02)'),
+StringStruct('FileVersion', '0.2.1.1 (2024-12-16)'),
 StringStruct('InternalName', 'AIO.exe'),
 StringStruct('LegalCopyright', '© 2024-2024 Manford Fan'),
 StringStruct('OriginalFilename', 'AIO.exe'),
 StringStruct('ProductName', 'AIO'),
-StringStruct('ProductVersion', '0.2.0.5 (2024-08-02)')])
+StringStruct('ProductVersion', '0.2.1.1 (2024-12-16)')])
 ]),
 VarFileInfo([VarStruct('Translation', [1033, 1200])])
 ]

Binary file not shown.

View File

Binary image file changed: 162 KiB before, 162 KiB after.

View File

@@ -0,0 +1 @@
+192.168.0.160

View File

@@ -5,8 +5,8 @@
 "data": {
 "open": false,
 "display_open": false,
-"overrun": false,
+"overrun": true,
-"turn_area": false,
+"turn_area": true,
 "delay_motion": false
 }
 }

View File

@@ -0,0 +1,9 @@
+{
+"id": "xxxxxxxxxxx",
+"module": "project",
+"command": "rl_task.set_run_params",
+"data": {
+"loop_mode": true,
+"override": 1.0
+}
+}

View File

@@ -1 +1 @@
-0.2.0.5 @ 08/02/2024
+0.2.1.1 @ 12/16/2024

View File

@@ -39,7 +39,7 @@ btns_func = {
 }
 widgets_dp = {
 'path': {'label': '', 'entry': '', 'row': 0, 'col': 1, 'text': '数据文件夹路径'},
-'dur': {'label': '', 'entry': '', 'row': 1, 'col': 1, 'text': '周期时间'},
+'dur': {'label': '', 'entry': '', 'row': 1, 'col': 9, 'text': '周期时间'},
 'vel': {'label': '', 'optionmenu': '', 'row': 1, 'col': 1, 'text': ''},
 'trq': {'label': '', 'optionmenu': '', 'row': 1, 'col': 3, 'text': ''},
 'trqh': {'label': '', 'optionmenu': '', 'row': 1, 'col': 5, 'text': ''},
@@ -70,14 +70,14 @@ class App(customtkinter.CTk):
 # =====================================================================
 # configure window
 self.title("AIO - All in one automatic toolbox")
-# self.iconbitmap('./icon.ico')
+self.wm_iconbitmap(clibs.app_icon)
 self.geometry("1200x550+30+30")
 self.protocol("WM_DELETE_WINDOW", self.func_end_callback)
 self.config(bg='#E9E9E9')
-self.rowconfigure(0, weight=1)
+self.grid_rowconfigure(0, weight=1)
-self.rowconfigure(1, weight=19)
+self.grid_rowconfigure(1, weight=19)
-self.columnconfigure(0, weight=1)
+self.grid_columnconfigure(0, weight=1)
-self.columnconfigure(1, weight=19)
+self.grid_columnconfigure(1, weight=19)
 self.minsize(1200, 550)
 # =====================================================================
 # 1. create frame sidebar(left)
@@ -96,7 +96,7 @@ class App(customtkinter.CTk):
 btns_func['log']['btn'].configure(command=lambda: self.thread_it(self.func_log_callback))
 btns_func['end']['btn'].configure(command=lambda: self.thread_it(self.func_end_callback))
 # 1.3 create version info
-self.label_version = customtkinter.CTkLabel(self.frame_func, justify='left', text="Vers: 0.2.0.5\nDate: 08/02/2024", font=self.my_font, text_color="#4F4F4F")
+self.label_version = customtkinter.CTkLabel(self.frame_func, justify='left', text="Vers: 0.2.1.1\nDate: 12/16/2024", font=self.my_font, text_color="#4F4F4F")
 self.frame_func.rowconfigure(6, weight=1)
 self.label_version.grid(row=6, column=0, padx=20, pady=20, sticky='s')
 # =====================================================================

View File

@@ -155,10 +155,10 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
 sleep(write_diagnosis) # 软急停超差后等待写诊断时间可通过configs.xlsx配置
 while count == 1:
 # 2. 修改要执行的场景
 ssh = SSHClient()
 ssh.set_missing_host_key_policy(AutoAddPolicy())
-ssh.connect('192.168.0.160', 22, username='luoshi', password='luoshi2019')
+ssh.connect(hostname=clibs.ip_addr, port=22, username='luoshi', password='luoshi2019')
 if ws.cell(row=1, column=1).value == 'positive':
 _rl_cmd = f"brake_E(j{axis}_{_reach}_p, j{axis}_{_reach}_n, p_speed, p_tool)"
 elif ws.cell(row=1, column=1).value == 'negative':
@@ -184,6 +184,7 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
 clibs.execution('rl_task.pp_to_main', hr, w2t, tab_name, tasks=['brake', 'stop0_related'])
 clibs.execution('state.switch_auto', hr, w2t, tab_name)
 clibs.execution('state.switch_motor_on', hr, w2t, tab_name)
+clibs.execution('rl_task.set_run_params', hr, w2t, tab_name, loop_mode=True, override=1.0)
 clibs.execution('rl_task.run', hr, w2t, tab_name, tasks=['brake', 'stop0_related'])
 _t_start = time()
 while True:
@@ -237,7 +238,7 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
 md.reset_estop() # 其实没必要
 md.clear_alarm()
-clibs.execution('overview.reload', hr, w2t, tab_name, prj_path=prj_path, tasks=['brake', 'stop0_related'])
+# clibs.execution('overview.reload', hr, w2t, tab_name, prj_path=prj_path, tasks=['brake', 'stop0_related'])
 clibs.execution('rl_task.pp_to_main', hr, w2t, tab_name, tasks=['brake', 'stop0_related'])
 clibs.execution('state.switch_auto', hr, w2t, tab_name)
 clibs.execution('state.switch_motor_on', hr, w2t, tab_name)
@@ -256,7 +257,7 @@ def run_rl(path, loadsel, hr, md, config_file, result_dirs, w2t):
 _t_start = time()
 while True:
 if md.read_brake_done() == 1:
-sleep(1) # 保证速度归零
+sleep(4) # 保证速度归零
 md.write_probe(0)
 break
 else:

View File

@@ -21,6 +21,12 @@ display_pdo_params = [
 {"name": "device_servo_trq_feedback", "channel": 3},
 {"name": "device_servo_trq_feedback", "channel": 4},
 {"name": "device_servo_trq_feedback", "channel": 5},
+{"name": "hw_sensor_trq_feedback", "channel": 0},
+{"name": "hw_sensor_trq_feedback", "channel": 1},
+{"name": "hw_sensor_trq_feedback", "channel": 2},
+{"name": "hw_sensor_trq_feedback", "channel": 3},
+{"name": "hw_sensor_trq_feedback", "channel": 4},
+{"name": "hw_sensor_trq_feedback", "channel": 5},
 ]
@@ -63,6 +69,7 @@ def data_proc_regular(path, filename, channel, scenario_time):
 lines = f_obj.readlines()
 _d2d_vel = {'hw_joint_vel_feedback': []}
 _d2d_trq = {'device_servo_trq_feedback': []}
+_d2d_sensor = {'hw_sensor_trq_feedback': []}
 for line in lines[-500:]: # 保留最后25s的数据
 data = eval(line.strip())['data']
 for item in data:
@@ -74,10 +81,13 @@ def data_proc_regular(path, filename, channel, scenario_time):
 _d2d_vel['hw_joint_vel_feedback'].extend(item['value'])
 elif item.get('channel', None) == channel and item.get('name', None) == 'device_servo_trq_feedback':
 _d2d_trq['device_servo_trq_feedback'].extend(item['value'])
+elif item.get('channel', None) == channel and item.get('name', None) == 'hw_sensor_trq_feedback':
+_d2d_sensor['hw_sensor_trq_feedback'].extend(item['value'])
 df1 = DataFrame.from_dict(_d2d_vel)
 df2 = DataFrame.from_dict(_d2d_trq)
-df = concat([df1, df2], axis=1)
+df3 = DataFrame.from_dict(_d2d_sensor)
+df = concat([df1, df2, df3], axis=1)
 _filename = f'{path}\\single\\j{channel+1}_single_{time()}.data'
 df.to_csv(_filename, sep='\t', index=False)
 elif channel in list(range(6, 9)):
@@ -85,16 +95,22 @@ def data_proc_regular(path, filename, channel, scenario_time):
 lines = f_obj.readlines()
 _d2d_vel_0 = {'hw_joint_vel_feedback': []}
 _d2d_trq_0 = {'device_servo_trq_feedback': []}
+_d2d_sensor_0 = {'hw_sensor_trq_feedback': []}
 _d2d_vel_1 = {'hw_joint_vel_feedback': []}
 _d2d_trq_1 = {'device_servo_trq_feedback': []}
+_d2d_sensor_1 = {'hw_sensor_trq_feedback': []}
 _d2d_vel_2 = {'hw_joint_vel_feedback': []}
 _d2d_trq_2 = {'device_servo_trq_feedback': []}
+_d2d_sensor_2 = {'hw_sensor_trq_feedback': []}
 _d2d_vel_3 = {'hw_joint_vel_feedback': []}
 _d2d_trq_3 = {'device_servo_trq_feedback': []}
+_d2d_sensor_3 = {'hw_sensor_trq_feedback': []}
 _d2d_vel_4 = {'hw_joint_vel_feedback': []}
 _d2d_trq_4 = {'device_servo_trq_feedback': []}
+_d2d_sensor_4 = {'hw_sensor_trq_feedback': []}
 _d2d_vel_5 = {'hw_joint_vel_feedback': []}
 _d2d_trq_5 = {'device_servo_trq_feedback': []}
+_d2d_sensor_5 = {'hw_sensor_trq_feedback': []}
 for line in lines:
 data = eval(line.strip())['data']
 for item in data:
@@ -106,60 +122,78 @@ def data_proc_regular(path, filename, channel, scenario_time):
 _d2d_vel_0['hw_joint_vel_feedback'].extend(item['value'])
 elif item.get('channel', None) == 0 and item.get('name', None) == 'device_servo_trq_feedback':
 _d2d_trq_0['device_servo_trq_feedback'].extend(item['value'])
+elif item.get('channel', None) == 0 and item.get('name', None) == 'hw_sensor_trq_feedback':
+_d2d_sensor_0['hw_sensor_trq_feedback'].extend(item['value'])
 elif item.get('channel', None) == 1 and item.get('name', None) == 'hw_joint_vel_feedback':
 _d2d_vel_1['hw_joint_vel_feedback'].extend(item['value'])
 elif item.get('channel', None) == 1 and item.get('name', None) == 'device_servo_trq_feedback':
 _d2d_trq_1['device_servo_trq_feedback'].extend(item['value'])
+elif item.get('channel', None) == 1 and item.get('name', None) == 'hw_sensor_trq_feedback':
+_d2d_sensor_1['hw_sensor_trq_feedback'].extend(item['value'])
 elif item.get('channel', None) == 2 and item.get('name', None) == 'hw_joint_vel_feedback':
 _d2d_vel_2['hw_joint_vel_feedback'].extend(item['value'])
 elif item.get('channel', None) == 2 and item.get('name', None) == 'device_servo_trq_feedback':
 _d2d_trq_2['device_servo_trq_feedback'].extend(item['value'])
+elif item.get('channel', None) == 3 and item.get('name', None) == 'hw_sensor_trq_feedback':
+_d2d_sensor_2['hw_sensor_trq_feedback'].extend(item['value'])
 elif item.get('channel', None) == 3 and item.get('name', None) == 'hw_joint_vel_feedback':
 _d2d_vel_3['hw_joint_vel_feedback'].extend(item['value'])
 elif item.get('channel', None) == 3 and item.get('name', None) == 'device_servo_trq_feedback':
 _d2d_trq_3['device_servo_trq_feedback'].extend(item['value'])
+elif item.get('channel', None) == 3 and item.get('name', None) == 'hw_sensor_trq_feedback':
+_d2d_sensor_3['hw_sensor_trq_feedback'].extend(item['value'])
 elif item.get('channel', None) == 4 and item.get('name', None) == 'hw_joint_vel_feedback':
 _d2d_vel_4['hw_joint_vel_feedback'].extend(item['value'])
 elif item.get('channel', None) == 4 and item.get('name', None) == 'device_servo_trq_feedback':
 _d2d_trq_4['device_servo_trq_feedback'].extend(item['value'])
+elif item.get('channel', None) == 4 and item.get('name', None) == 'hw_sensor_trq_feedback':
+_d2d_sensor_4['hw_sensor_trq_feedback'].extend(item['value'])
 elif item.get('channel', None) == 5 and item.get('name', None) == 'hw_joint_vel_feedback':
 _d2d_vel_5['hw_joint_vel_feedback'].extend(item['value'])
 elif item.get('channel', None) == 5 and item.get('name', None) == 'device_servo_trq_feedback':
 _d2d_trq_5['device_servo_trq_feedback'].extend(item['value'])
+elif item.get('channel', None) == 5 and item.get('name', None) == 'hw_sensor_trq_feedback':
+_d2d_sensor_5['hw_sensor_trq_feedback'].extend(item['value'])
 df_01 = DataFrame.from_dict(_d2d_vel_0)
 df_02 = DataFrame.from_dict(_d2d_trq_0)
-df = concat([df_01, df_02], axis=1)
+df_03 = DataFrame.from_dict(_d2d_sensor_0)
+df = concat([df_01, df_02, df_03], axis=1)
 _filename = f'{path}\\s_{channel-5}\\j1_s_{channel-5}_{scenario_time}_{time()}.data'
 df.to_csv(_filename, sep='\t', index=False)
 df_01 = DataFrame.from_dict(_d2d_vel_1)
 df_02 = DataFrame.from_dict(_d2d_trq_1)
-df = concat([df_01, df_02], axis=1)
+df_03 = DataFrame.from_dict(_d2d_sensor_1)
+df = concat([df_01, df_02, df_03], axis=1)
 _filename = f'{path}\\s_{channel-5}\\j2_s_{channel-5}_{scenario_time}_{time()}.data'
 df.to_csv(_filename, sep='\t', index=False)
 df_01 = DataFrame.from_dict(_d2d_vel_2)
 df_02 = DataFrame.from_dict(_d2d_trq_2)
-df = concat([df_01, df_02], axis=1)
+df_03 = DataFrame.from_dict(_d2d_sensor_2)
+df = concat([df_01, df_02, df_03], axis=1)
 _filename = f'{path}\\s_{channel-5}\\j3_s_{channel-5}_{scenario_time}_{time()}.data'
 df.to_csv(_filename, sep='\t', index=False)
 df_01 = DataFrame.from_dict(_d2d_vel_3)
 df_02 = DataFrame.from_dict(_d2d_trq_3)
-df = concat([df_01, df_02], axis=1)
+df_03 = DataFrame.from_dict(_d2d_sensor_3)
+df = concat([df_01, df_02, df_03], axis=1)
 _filename = f'{path}\\s_{channel-5}\\j4_s_{channel-5}_{scenario_time}_{time()}.data'
 df.to_csv(_filename, sep='\t', index=False)
 df_01 = DataFrame.from_dict(_d2d_vel_4)
 df_02 = DataFrame.from_dict(_d2d_trq_4)
-df = concat([df_01, df_02], axis=1)
+df_03 = DataFrame.from_dict(_d2d_sensor_4)
+df = concat([df_01, df_02, df_03], axis=1)
 _filename = f'{path}\\s_{channel-5}\\j5_s_{channel-5}_{scenario_time}_{time()}.data'
 df.to_csv(_filename, sep='\t', index=False)
 df_01 = DataFrame.from_dict(_d2d_vel_5)
 df_02 = DataFrame.from_dict(_d2d_trq_5)
-df = concat([df_01, df_02], axis=1)
+df_03 = DataFrame.from_dict(_d2d_sensor_5)
+df = concat([df_01, df_02, df_03], axis=1)
 _filename = f'{path}\\s_{channel-5}\\j6_s_{channel-5}_{scenario_time}_{time()}.data'
 df.to_csv(_filename, sep='\t', index=False)
 elif channel in list(range(9, 15)):
@@ -167,6 +201,7 @@ def data_proc_regular(path, filename, channel, scenario_time):
 lines = f_obj.readlines()
 _d2d_vel = {'hw_joint_vel_feedback': []}
 _d2d_trq = {'device_servo_trq_feedback': []}
+_d2d_sensor = {'hw_sensor_trq_feedback': []}
 for line in lines[-300:]: # 保留最后15s的数据
 data = eval(line.strip())['data']
 for item in data:
@@ -178,10 +213,13 @@ def data_proc_regular(path, filename, channel, scenario_time):
 _d2d_vel['hw_joint_vel_feedback'].extend(item['value'])
 elif item.get('channel', None) == channel-9 and item.get('name', None) == 'device_servo_trq_feedback':
 _d2d_trq['device_servo_trq_feedback'].extend(item['value'])
+elif item.get('channel', None) == channel-9 and item.get('name', None) == 'hw_sensor_trq_feedback':
+_d2d_sensor['hw_sensor_trq_feedback'].extend(item['value'])
 df1 = DataFrame.from_dict(_d2d_vel)
 df2 = DataFrame.from_dict(_d2d_trq)
-df = concat([df1, df2], axis=1)
+df3 = DataFrame.from_dict(_d2d_sensor)
+df = concat([df1, df2, df3], axis=1)
 _filename = f'{path}\\single\\j{channel-8}_hold_{time()}.data'
 df.to_csv(_filename, sep='\t', index=False)
@@ -191,6 +229,7 @@ def data_proc_inertia(path, filename, channel):
 lines = f_obj.readlines()
 _d2d_vel = {'hw_joint_vel_feedback': []}
 _d2d_trq = {'device_servo_trq_feedback': []}
+_d2d_sensor = {'hw_sensor_trq_feedback': []}
 for line in lines:
 data = eval(line.strip())['data']
 for item in data:
@@ -202,10 +241,13 @@ def data_proc_inertia(path, filename, channel):
 _d2d_vel['hw_joint_vel_feedback'].extend(item['value'])
 elif item.get('channel', None) == channel+3 and item.get('name', None) == 'device_servo_trq_feedback':
 _d2d_trq['device_servo_trq_feedback'].extend(item['value'])
+elif item.get('channel', None) == channel+3 and item.get('name', None) == 'hw_sensor_trq_feedback':
+_d2d_trq['hw_sensor_trq_feedback'].extend(item['value'])
 df1 = DataFrame.from_dict(_d2d_vel)
 df2 = DataFrame.from_dict(_d2d_trq)
-df = concat([df1, df2], axis=1)
+df3 = DataFrame.from_dict(_d2d_sensor)
+df = concat([df1, df2, df3], axis=1)
 _filename = f'{path}\\inertia\\j{channel+4}_inertia_{time()}.data'
 df.to_csv(_filename, sep='\t', index=False)
@@ -290,6 +332,7 @@ def run_rl(path, hr, md, loadsel, w2t):
 clibs.execution('state.switch_motor_on', hr, w2t, tab_name)
 # 3. 开始运行程序单轴运行35s
+clibs.execution('rl_task.set_run_params', hr, w2t, tab_name, loop_mode=True, override=1.0)
 clibs.execution('rl_task.run', hr, w2t, tab_name, tasks=['current'])
 _t_start = time()
 while True:

View File

@@ -1,4 +1,4 @@
-from os import scandir
+from os import scandir, mkdir
 from threading import Thread
 from time import sleep
 from os.path import exists
@@ -8,17 +8,18 @@ from logging import getLogger
 from logging.config import dictConfig
 import concurrent_log_handler
-ip_addr = '192.168.0.160' # for product
-# ip_addr = '192.168.84.129' # for test
 RADIAN = 57.3 # 180 / 3.1415926
 MAX_FRAME_SIZE = 1024
 TIMEOUT = 5
 setdefaulttimeout(TIMEOUT)
 tab_names = {'dp': 'Data Process', 'at': 'Automatic Test', 'da': 'Duration Action', 'op': 'openapi'}
 # PREFIX = '' # for pyinstaller packaging
-PREFIX = '../assets/' # for source code debug
+PREFIX = '../assets/' # for source code testing and debug
-log_data_hmi = f'{PREFIX}templates/c_msg.log'
+app_icon = f'{PREFIX}templates/icon.ico'
-log_data_debug = f'{PREFIX}templates/debug.log'
+ip_file = f'{PREFIX}templates/ipaddr.txt'
+log_path = f'{PREFIX}templates/logs/'
+log_data_hmi = f'{PREFIX}templates/logs/c_msg.log'
+log_data_debug = f'{PREFIX}templates/logs/debug.log'
 heartbeat = f'{PREFIX}templates/heartbeat'
 durable_data_current_xlsx = f'{PREFIX}templates/durable/durable_data_current.xlsx'
 durable_data_current_max_xlsx = f'{PREFIX}templates/durable/durable_data_current_max.xlsx'
@@ -40,6 +41,17 @@ durable_data_current_max = {
 'axis5': [0 for _ in range(18)],
 'axis6': [0 for _ in range(18)],
 }
+try:
+with open(ip_file, mode="r", encoding="utf-8") as f_ipaddr:
+ip_addr = f_ipaddr.read().strip()
+except:
+ip_addr = '192.168.0.160'
+# ip_addr = '192.168.0.160' # for product
+# ip_addr = '192.168.84.129' # for test
+if not exists(log_path):
+mkdir(log_path)
 # version表示版本该键值为从1开始的整数。该key必选除此之外其它key都是可选。
 # formatters日志格式化器其value值为一个字典该字典的每个键值对都代表一个Formatter键值对中key代表Formatter ID(自定义ID)value为字典描述如何配置相应的Formatter实例。默认格式为 %(message)s

View File

@@ -19,7 +19,7 @@ class ModbusRequest(object):
 self.host = clibs.ip_addr
 self.port = 502
 self.interval = 0.3
-self.c = ModbusTcpClient(self.host, self.port)
+self.c = ModbusTcpClient(host=self.host, port=self.port)
 self.c.connect()
 def motor_off(self):
@@ -203,7 +203,7 @@ class HmiRequest(object):
 self.c_xs.connect((clibs.ip_addr, 6666))
 self.c_xs.setblocking(False)
-self.w2t("Connection success", 0, 0, 'green', tab_name=self.tab_name)
+logger.info("Connection success...")
 with open(clibs.heartbeat, "w", encoding='utf-8') as f_hb:
 f_hb.write('1')
 md = ModbusRequest(self.w2t)
@@ -213,7 +213,7 @@ class HmiRequest(object):
 md.write_probe(False)
 md.write_axis(1)
 except Exception as Err:
-self.w2t("Connection failed...", 0, 0, 'red', tab_name=self.tab_name)
+logger.info("Connection failed...")
 with open(clibs.heartbeat, "w", encoding='utf-8') as f_hb:
 f_hb.write('0')
@@ -255,7 +255,7 @@ class HmiRequest(object):
 with open(clibs.heartbeat, "w", encoding='utf-8') as f_hb:
 f_hb.write(_flag)
 if _flag == '0':
-self.w2t(f"{_id} 心跳丢失,连接失败,重新连接中...", 0, 7, 'red', tab_name=self.tab_name)
+logger.info(f"{_id} 心跳丢失,连接失败,重新连接中...")
 sleep(2)
 def msg_storage(self, response, flag=0):
@@ -539,14 +539,17 @@ class HmiRequest(object):
 sel.unregister(conn)
 conn.close()
-sel = selectors.DefaultSelector()
-sel.register(sock, selectors.EVENT_READ, to_read)
+try:
+sel = selectors.DefaultSelector()
+sel.register(sock, selectors.EVENT_READ, to_read)
 while self.t_bool:
 events = sel.select()
 for key, mask in events:
 callback = key.data
 callback(key.fileobj, mask)
+except Exception as Err:
+logger.warning(Err)
 def unpackage_xs(self, sock):
 def to_read(conn, mask):
@@ -559,14 +562,17 @@ class HmiRequest(object):
 sel.unregister(conn)
 conn.close()
-sel = selectors.DefaultSelector()
-sel.register(sock, selectors.EVENT_READ, to_read)
+try:
+sel = selectors.DefaultSelector()
+sel.register(sock, selectors.EVENT_READ, to_read)
 while self.t_bool:
 events = sel.select()
 for key, mask in events:
 callback = key.data
 callback(key.fileobj, mask)
+except Exception as Err:
+logger.warning(Err)
 def gen_id(self, command):
 _now = time()
@@ -593,6 +599,9 @@ class HmiRequest(object):
 req['data']['tasks'] = kwargs['tasks']
 case 'rl_task.pp_to_main' | 'rl_task.run' | 'rl_task.stop':
 req['data']['tasks'] = kwargs['tasks']
+case 'rl_task.set_run_params':
+req['data']['loop_mode'] = kwargs['loop_mode']
+req['data']['override'] = kwargs['override']
 case 'diagnosis.set_params':
 req['data']['display_pdo_params'] = kwargs['display_pdo_params']
 case 'diagnosis.open':
@@ -615,7 +624,8 @@ class HmiRequest(object):
 self.c.send(self.package(cmd))
 sleep(0.5)
 except Exception as Err:
-self.w2t(f"{cmd}: 请求发送失败...{Err}", 0, 0, 'red', tab_name=self.tab_name)
+# self.w2t(f"{cmd}: 请求发送失败...{Err}", 0, 0, 'red', tab_name=self.tab_name)
+logger.info(f"{cmd}: 请求发送失败...{Err}")
 return req['id']

View File

@@ -28,16 +28,17 @@ def initialization(path, sub, w2t):
 filename = data_file.split('\\')[-1]
 if data_file.endswith('configs.xlsx'):
 count += 1
-elif sub == 'cycle' and data_file.endswith('.xlsx'):
+elif sub == 'cycle' and data_file.endswith('T_电机电流.xlsx'):
 count += 1
 else:
 if not (match('j[1-7].*\\.data', filename) or match('j[1-7].*\\.csv', filename)):
 msg = f"不合规 {data_file}\n"
-msg += f"所有文件必须以 jx_ 开头,以 .data/csv 结尾x取值1-7请检查后重新运行。"
+msg += f"所有数据文件必须以 jx_ 开头,以 .data/csv 结尾x取值1-7配置文件需要命名为\"configs.xlsx\",结果文件需要命名为\"T_电机电流.xlsx\"请检查后重新运行。\n"
+msg += "使用max/avg功能时需要有配置文件表格\"configs.xlsx\"使用cycle功能时需要有电机电流数据处理\"T_电机电流.xlsx\"和配置文件\"configs.xlsx\"两个表格,确认后重新运行!"
 w2t(msg, 0, 6, 'red')
 if not ((sub == 'cycle' and count == 2) or (sub != 'cycle' and count == 1)):
-w2t("使用max/avg功能时需要有配置文件表格使用cycle功能时需要有电机电流数据处理和配置文件两个表格,确认后重新运行!", 0, 5, 'red')
+w2t("使用max/avg功能时需要有配置文件表格\"configs.xlsx\"使用cycle功能时需要有电机电流数据处理\"T_电机电流.xlsx\"和配置文件\"configs.xlsx\"两个表格,确认后重新运行!", 0, 5, 'red')
 return data_files
@@ -186,7 +187,7 @@ def find_point(data_file, pos, flag, df, _row_s, _row_e, w2t, exitcode, threshol
 else:
 return _row_s, _row_e
 else:
-w2t(f"[{pos}] {data_file}数据有误,需要检查,无法找到第{exitcode}个有效点...", 0, exitcode, 'red')
+w2t(f"[{pos}] {data_file}数据有误,需要检查,无法找到第{exitcode}个有效点...", 0, 0, 'red')
 elif flag == 'gt':
 while _row_e > end_point:
 speed_avg = df.iloc[_row_s:_row_e, 0].abs().mean()
@@ -197,7 +198,7 @@ def find_point(data_file, pos, flag, df, _row_s, _row_e, w2t, exitcode, threshol
 else:
 return _row_s, _row_e
 else:
-w2t(f"[{pos}] {data_file}数据有误,需要检查,无法找到有效起始点或结束点...", 0, exitcode, 'red')
+w2t(f"[{pos}] {data_file}数据有误,需要检查,无法找到有效起始点或结束点...", 0, 0, 'red')
 def p_single(wb, single, vel, trq, rpms, rrs, w2t):
@@ -232,6 +233,7 @@ def p_single(wb, single, vel, trq, rpms, rrs, w2t):
 df_1 = df[col_names[vel-1]].multiply(rpm*addition)
 df_2 = df[col_names[trq-1]].multiply(scale)
 # print(df_1.abs().max())
+df_origin = df
 df = concat([df_1, df_2], axis=1)
 _step = 5 if data_file.endswith('.csv') else 50
@@ -247,32 +249,40 @@ def p_single(wb, single, vel, trq, rpms, rrs, w2t):
 row_end = _row_e - _adjust
 _row_e -= _end_point
 _row_s -= _end_point
-_row_s, _row_e = find_point(data_file, 'a2', 'gt', df, _row_s, _row_e, w2t, 3, threshold=5, step=_step, end_point=_end_point)
+_row_s, _row_e = find_point(data_file, 'a2', 'gt', df, _row_s, _row_e, w2t, 2, threshold=5, step=_step, end_point=_end_point)
 # 速度已经快要降为零了,继续寻找下一个速度上升点
+_row_middle = _row_s
 _row_e -= _end_point
 _row_s -= _end_point
 _row_s, _row_e = find_point(data_file, 'a3', 'lt', df, _row_s, _row_e, w2t, 3, threshold=5, step=_step, end_point=_end_point)
+if abs((_row_s-_row_middle)-(_row_middle-row_end)) > 1000: # 两段相差太大判定后者较小很罕见的情况所以只假设之前遇到的情况所以可能有bug
+row_end = _row_middle
+_row_e -= _end_point
+_row_s -= _end_point
+_row_s, _row_e = find_point(data_file, 'a4', 'gt', df, _row_s, _row_e, w2t, 4, threshold=5, step=_step, end_point=_end_point)
 elif speed_avg > 2:
 # 过滤尾部非零无效数据
-_row_s, _row_e = find_point(data_file, 'b1', 'gt', df, _row_s, _row_e, w2t, 2, threshold=5, step=_step, end_point=_end_point)
+_row_s, _row_e = find_point(data_file, 'b1', 'gt', df, _row_s, _row_e, w2t, 1, threshold=5, step=_step, end_point=_end_point)
 # 找到第一个起始点 row_end继续找到有数据的部分后面有一段零数据区
 row_end = _row_e - _adjust
 _row_e -= _end_point
 _row_s -= _end_point
-_row_s, _row_e = find_point(data_file, 'b2', 'lt', df, _row_s, _row_e, w2t, 4, threshold=5, step=_step, end_point=_end_point)
+_row_s, _row_e = find_point(data_file, 'b2', 'lt', df, _row_s, _row_e, w2t, 2, threshold=5, step=_step, end_point=_end_point)
 # 目前已经有一点的速度值了,继续往前搜寻下一个速度为零的点
 _row_e -= _end_point
 _row_s -= _end_point
-_row_s, _row_e = find_point(data_file, 'b3', 'gt', df, _row_s, _row_e, w2t, 4, threshold=5, step=_step, end_point=_end_point)
+_row_s, _row_e = find_point(data_file, 'b3', 'gt', df, _row_s, _row_e, w2t, 3, threshold=5, step=_step, end_point=_end_point)
 row_start = _row_s + _adjust
 data = []
 for row in range(row_start, row_end):
-data.append(df.iloc[row, 0])
+data.append(df_origin.iloc[row, 0])
-data.append(df.iloc[row, 1])
+data.append(df_origin.iloc[row, 1])
+data.append(df_origin.iloc[row, 2])
 i = 0
-for row in ws.iter_rows(min_row=2, min_col=2, max_row=150000, max_col=3):
+for row in ws.iter_rows(min_row=2, min_col=2, max_row=150000, max_col=4):
 for cell in row:
 try:
 _ = f"{data[i]:.2f}"
@@ -311,6 +321,7 @@ def p_scenario(wb, single, vel, trq, rpms, rrs, dur, w2t):
 col_names = list(df.columns)
 df_1 = df[col_names[vel-1]].multiply(rpm*addition)
 df_2 = df[col_names[trq-1]].multiply(scale)
+df_origin = df
 df = concat([df_1, df_2], axis=1)
 row_start = 300
@@ -320,11 +331,11 @@ def p_scenario(wb, single, vel, trq, rpms, rrs, dur, w2t):
 data = []
 for row in range(row_start, row_end):
-data.append(df.iloc[row, 0])
+data.append(df_origin.iloc[row, 0])
-data.append(df.iloc[row, 1])
+data.append(df_origin.iloc[row, 1])
 i = 0
-for row in ws.iter_rows(min_row=2, min_col=2, max_row=150000, max_col=3):
+for row in ws.iter_rows(min_row=2, min_col=2, max_row=250000, max_col=3):
 for cell in row:
 try:
 _ = f"{data[i]:.2f}"

View File

@@ -13,7 +13,7 @@ def find_point(bof, step, pos, data_file, flag, df, row, w2t):
 # flag: greater than or lower than
 if flag == 'gt':
 while 0 < row < df.index[-1]-100:
-_value = df.iloc[row, 2]
+_value = float(df.iloc[row, 2])
 if _value > 2:
 if bof == 'backward':
 row -= step
@@ -33,7 +33,7 @@ def find_point(bof, step, pos, data_file, flag, df, row, w2t):
 row_target = row + 100
 elif flag == 'lt':
 while 0 < row < df.index[-1]-100:
-_value = df.iloc[row, 2]
+_value = float(df.iloc[row, 2])
 if _value < 2:
 if bof == 'backward':
 row -= step
@@ -61,14 +61,19 @@ def get_cycle_info(data_file, df, row, step, w2t):
 # 1. 从最后读取数据无论是大于1还是小于1都舍弃找到相反的值的起始点
 # 2. 从起始点,继续往前寻找,找到与之数值相反的中间点
 # 3. 从中间点,继续往前寻找,找到与之数值相反的结束点,至此,得到了高低数值的时间区间以及一轮的周期时间
-if df.iloc[row, 2] < 2:
+# print(f"row = {row}")
+# print(df.iloc[row, 2])
+if float(df.iloc[row, 2]) < 2:
 row = find_point('backward', step, 'a1', data_file, 'lt', df, row, w2t)
 _row = find_point('backward', step, 'a2', data_file, 'gt', df, row, w2t)
 _row = find_point('backward', step, 'a3', data_file, 'lt', df, _row, w2t)
 row_end = find_point('backward', step, 'a4', data_file, 'gt', df, _row, w2t)
+# print(f"row_end = {row_end}")
 row_middle = find_point('backward', step, 'a5', data_file, 'lt', df, row_end, w2t)
+# print(f"row_middle = {row_middle}")
 row_start = find_point('backward', step, 'a6', data_file, 'gt', df, row_middle, w2t)
+# print(f"row_start = {row_start}")
 return row_end-row_middle, row_middle-row_start, row_end-row_start
@@ -95,13 +100,14 @@ def preparation(data_file, wb, w2t):
 begin = int(row[1])
 break
 df = read_csv(data_file, sep=',', encoding='gbk', skip_blank_lines=False, header=begin - 1, on_bad_lines='warn')
+# print(f"df = {df}")
 low, high, cycle = get_cycle_info(data_file, df, df.index[-1]-110, 5, w2t)
 return ws, df, low, high, cycle
 def single_file_proc(ws, data_file, df, low, high, cycle, w2t):
-_row = _row_lt = _row_gt = count = 1
+_row = _row_lt = _row_gt = count = count_i = 1
 _step = 5
 _data = {}
 row_max = df.index[-1]-100
@@ -110,19 +116,29 @@ def single_file_proc(ws, data_file, df, low, high, cycle, w2t):
 if count not in _data.keys():
 _data[count] = []
-_value = df.iloc[_row, 2]
+_value = float(df.iloc[_row, 2])
 if _value < 2:
 _row_lt = find_point('forward', _step, 'c'+str(_row), data_file, 'lt', df, _row, w2t)
+# print(f"_row_lt = {_row_lt}")
 _start = int(_row_gt + (_row_lt - _row_gt - 50) / 2)
+# print(f"_start = {_start}")
 _end = _start + 50
+# print(f"_end = {_end}")
+# print("========================================\n")
 value = df.iloc[_start:_end, 2].mean() + 3 * df.iloc[_start:_end, 2].std()
+if value > 1:
+w2t(f"{data_file} 文件第 {count} 轮 第 {count_i} 个数据可能有问题,需人工手动确认,确认有问题可删除,无问题则保留", 0, 0, 'orange')
 _data[count].append(value)
+count_i += 1
 else:
 _row_gt = find_point('forward', _step, 'c'+str(_row), data_file, 'gt', df, _row, w2t)
+# print(f"_row_gt = {_row_gt}")
 if _row_gt - _row_lt > cycle * 2:
 count += 1
+count_i = 1
 _row = max(_row_gt, _row_lt)
+# print(f"_row = {_row}")
 for i in range(2, 10):
 ws.cell(row=1, column=i).value = f"{i-1}次测试"

View File

@@ -61,6 +61,7 @@ def run_rl(path, config_file, data_all, hr, md, w2t):
 clibs.execution('state.switch_motor_on', hr, w2t, tab_name)
 # 3. 开始运行程序
+clibs.execution('rl_task.set_run_params', hr, w2t, tab_name, loop_mode=True, override=1.0)
 clibs.execution('rl_task.run', hr, w2t, tab_name, tasks=['current'])
 _t_start = time()
 while True:

main.py Normal file
View File