This page summarizes the main projects to which I have made significant scientific contributions.
This project is funded by Horizon Europe and aims "to improve the existing industry-established lean management approaches related to Reconfiguration Management through the digitalization of the production, characterized as Industry 4.0, by allowing for information sharing between value chain stakeholders." In this context, a human assistant system is proposed for human-centric shop floor reconfiguration.
My Role: I participated in the proposal stage and am contributing to the development of human assistive systems for extracting expert skills.
Project link: https://flex4res.eu/
Papers:
S. M. Raza, T. B. Tuli, and M. Manns, “Human Action Sequence Prediction for (Re)configuring Machine Tools,” Procedia CIRP, vol. 130, pp. 1170–1175, Jan. 2024, doi: 10.1016/j.procir.2024.10.223.
This project is funded by the German Research Foundation (DFG) and investigates how human and robot motion models can be coupled and implemented using motion-capture-driven approaches. In this context, latent-space control approaches are investigated for their ability to realize the principle of high mutual support and anticipation (a minimal sketch of the latent-space idea follows the publication list below).
My Role: I coordinated the proposal-writing stage and am a member of the team during project implementation, conducting research on human motion behavior modeling and control approaches for human-robot collaboration scenarios.
Link: https://protech.mb.uni-siegen.de/fams/research/projects/en/
Papers:
Coming soon (under review) ...
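To illustrate the latent-space idea mentioned above: the following is a minimal sketch in PyTorch, not the project's actual model (the publications are still under review). All class names, dimensions, and layer sizes are placeholder assumptions.

```python
import torch
import torch.nn as nn

class SharedLatentCoupling(nn.Module):
    """Couple human and robot motion through one shared latent space.

    Each modality has its own encoder/decoder pair mapping into the same
    low-dimensional code, so an observed human posture can be decoded
    into a corresponding robot configuration (and vice versa).
    """

    def __init__(self, human_dim=66, robot_dim=7, latent_dim=8):
        super().__init__()
        self.enc_human = nn.Sequential(
            nn.Linear(human_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim))
        self.dec_human = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, human_dim))
        self.enc_robot = nn.Sequential(
            nn.Linear(robot_dim, 64), nn.ReLU(), nn.Linear(64, latent_dim))
        self.dec_robot = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, robot_dim))

    def human_to_robot(self, human_pose):
        # Encode the human posture, then decode it as a robot configuration
        return self.dec_robot(self.enc_human(human_pose))

model = SharedLatentCoupling()
human_pose = torch.randn(1, 66)               # e.g. 22 joints x 3 channels
robot_cfg = model.human_to_robot(human_pose)  # -> (1, 7) joint values
```

Training such a model would typically minimize per-modality reconstruction losses plus an alignment term that maps paired human and robot trajectories to nearby latent codes.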
This project was funded by EFRE (the European Regional Development Fund) and investigated the interactions between humans and machines, including robots. To study human-machine dependencies, human movements were recorded using modern motion capture systems. From the recorded data, movement models can be created that support improved human-machine interaction (a small generic sketch follows the publication list below).
My Role: I conducted research on understanding human motion behavior; the results have been published in several scientific outlets.
Project link: https://www.smaps.org/motion-capture-system/
Papers:
Tuli, T. B., Henkel, M., & Manns, M. (2022). Latent Space Based Collaborative Motion Modeling from Motion Capture Data for Human Robot Collaboration. Procedia CIRP, 107, 1180–1185. (DOI)
Tuli, T.B., Manns, M., and Jonek, M. (2022) Understanding Shared Autonomy of Collaborative Humans Using Motion Capture System for Simulating Team Assembly. In: A.-L. Andersen, R. Andersen, T.D. Brunoe, M.S.S. Larsen, K. Nielsen, A. Napoleone, and S. Kjeldgaard, eds. Towards Sustainable Customization: Bridging Smart Products and Manufacturing Systems. Cham: Springer International Publishing, 527–534. (DOI)
Jonek, M., Manns, M., and Tuli, T.B. (2021) Virtuelle Montageplanung mit Motion Capture Systemen / Virtual assembly planning with motion capture systems. wt Werkstattstechnik online, 111 (04), 256–259. (DOI)
Manns, M., Tuli, T.B., and Schreiber, F. (2021) Identifying human intention during assembly operations using wearable motion capturing systems including eye focus. Procedia CIRP, 104, 924–929. (DOI)
Tuli, T.B., Kohl, L., Chala, S.A., Manns, M., and Ansari, F. (2021) Knowledge-Based Digital Twin for Predicting Interactions in Human-Robot Collaboration. In: 2021 26th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA). Västerås, Sweden. (DOI)
Frohn-Sörensen, P., Geueke, M., Tuli, T.B., et al. (2021) 3D printed prototyping tools for flexible sheet metal drawing. Int J Adv Manuf Technol, 115, 2623–2637. (DOI)
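As a rough, generic illustration of building a movement model from motion capture recordings (a sketch of one common technique, not the project's specific method): principal component analysis can compress recorded poses into a few components that capture the dominant movement patterns. The data shape and component count below are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA

# Recorded motion capture data: frames x pose channels
# (placeholder: 1000 frames of 66 channels, e.g. 22 joints x 3 values)
poses = np.random.randn(1000, 66)

# A low-dimensional linear movement model: a few principal components
# capture the dominant, coordinated movement patterns in the recording.
model = PCA(n_components=5).fit(poses)

# Project a new pose into the model and reconstruct it; the residual
# indicates how well the model explains the observed movement.
code = model.transform(poses[:1])
reconstructed = model.inverse_transform(code)
error = np.linalg.norm(poses[:1] - reconstructed)
```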
Predicting human activity and detecting subsequent actions are crucial for improving the interaction between humans and robots during collaborative operations. Deep-learning techniques are being applied to recognize human activities, including in industrial applications. However, the lack of sufficient datasets in the industrial domain and the complexity of some industrial activities, such as screw driving and assembling small parts, hinder the development and testing of activity-recognition models. Recently, the InHARD dataset (Industrial Human Action Recognition Dataset) was published to facilitate industrial human activity recognition for better human-robot collaboration, but it still lacks extended evaluation. In this regard, we employ human activity recognition memory and sequential networks (HARNets), combining convolutional neural network (CNN) and long short-term memory (LSTM) techniques (a minimal sketch of this combination follows the reference below).
Link: https://github.com/tadeleTuli/HARNets
Paper:
Tuli, T.B., Patel, V.M., and Manns, M. (2022) Industrial Human Activity Prediction and Detection Using Sequential Memory Networks. In: CPSL 2022 - Conference on Production Systems and Logistics. (DOI)
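The following is a minimal sketch of the CNN-LSTM combination described above, not the published HARNets architecture; the feature count, class count, and layer sizes are placeholder assumptions.

```python
import torch
import torch.nn as nn

class HARNetSketch(nn.Module):
    """CNN-LSTM for skeleton-based activity recognition (illustrative)."""

    def __init__(self, n_features=66, n_classes=14, hidden=128):
        super().__init__()
        # 1D convolutions extract short-range motion features over time
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # The LSTM models the longer-range temporal structure of an action
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                      # x: (batch, time, features)
        z = self.cnn(x.transpose(1, 2))        # -> (batch, 64, time)
        out, _ = self.lstm(z.transpose(1, 2))  # -> (batch, time, hidden)
        return self.head(out[:, -1])           # classify from last step

model = HARNetSketch()
clips = torch.randn(8, 120, 66)  # 8 clips, 120 frames, 66 joint channels
logits = model(clips)            # -> (8, 14) class scores
```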
MOSIM (an ITEA 3 project) aims to develop and implement a generic concept inspired by the FMI (Functional Mock-up Interface) standard, transferring the idea of co-simulating models from different simulation environments to the field of human simulation by introducing Motion Model Units (MMUs).
My Role: I modeled motion model units for applying pressure when pushing an object into a hole. The results are published in the following conference papers (a minimal MMU-style sketch follows the list).
Papers:
Jonek, M., Tuli, T.B., and Manns, M., 2022. Constraints for Motion Generation in Work Planning with Digital Human Simulations. In: A.-L. Andersen, R. Andersen, T.D. Brunoe, M.S.S. Larsen, K. Nielsen, A. Napoleone, and S. Kjeldgaard, eds. Towards Sustainable Customization: Bridging Smart Products and Manufacturing Systems. Cham: Springer International Publishing, 567–574. (DOI)
Tuli, T.B., Manns, M., Zöller, C., and Klein, D., 2021. Path planning for simulating human motions in manual assembly operations. Procedia CIRP, 104, 930–934. (DOI)
Tuli, T.B. and Manns, M., 2022. Comparison of AI-based Task Planning Approaches for Simulating Human-Robot Collaboration. In: A.-L. Andersen, R. Andersen, T.D. Brunoe, M.S.S. Larsen, K. Nielsen, A. Napoleone, and S. Kjeldgaard, eds. Towards Sustainable Customization: Bridging Smart Products and Manufacturing Systems. Cham: Springer International Publishing, 158–165. (DOI)
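To give a flavor of the MMU concept, here is a minimal, hypothetical sketch of an apply-pressure unit with an FMI-like stepped interface. The class and method names are illustrative assumptions, not the official MOSIM API.

```python
import numpy as np

class ApplyPressureMMU:
    """Sketch of a Motion Model Unit in the spirit of MOSIM's FMI-inspired
    co-simulation: assign an instruction once, then advance in fixed time
    steps, returning an updated hand pose and contact force each step."""

    def __init__(self, max_force=20.0, speed=0.05):
        self.max_force = max_force  # N, force ramped up after contact
        self.speed = speed          # m per step, approach velocity

    def assign_instruction(self, hand_pos, target_pos):
        self.hand = np.asarray(hand_pos, dtype=float)
        self.target = np.asarray(target_pos, dtype=float)
        self.force = 0.0

    def do_step(self, dt=1 / 60):
        """One co-simulation step: approach the hole, then build up force."""
        delta = self.target - self.hand
        dist = np.linalg.norm(delta)
        if dist > 1e-3:                 # approach phase: move toward target
            self.hand += delta / dist * min(self.speed, dist)
        else:                           # contact phase: apply pressure
            self.force = min(self.force + 5.0 * dt, self.max_force)
        return self.hand.copy(), self.force

mmu = ApplyPressureMMU()
mmu.assign_instruction(hand_pos=[0.0, 0.0, 0.3], target_pos=[0.0, 0.0, 0.0])
for _ in range(10):
    pose, force = mmu.do_step()
```

A real MMU would solve inverse kinematics for the full avatar and be blended with other active MMUs by the co-simulation framework; this sketch only illustrates the stepped, FMI-like lifecycle.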