C2F: Coarse-to-Fine Vision Control System for Automated Microassembly

Author(s): Shashank Tripathi*, Devesh R. Jain, Himanshu D. Sharma.

Journal Name: Nanoscience & Nanotechnology-Asia

Volume 9, Issue 2, 2019


Abstract:

Introduction: This paper presents the development of a fully automated system for performing 3D micromanipulation and microassembly tasks. The microassembly workstation consists of a 3 degree-of-freedom (DOF) MM3A® micromanipulator arm fitted with a microgripper, two 2-DOF PI® linear micromotion stages, an optical microscope coupled to a CCD image sensor, and two CMOS cameras for coarse vision.

Methods: The overall control strategy is subdivided into sequential vision-based routines: manipulator detection and coarse alignment, autofocus and fine alignment of the microgripper, target-object detection, and execution of the required assembly tasks. A section comparing various objective functions useful in the autofocusing regime is included.
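The autofocusing step ranks candidate focal positions by an objective (focus-measure) function evaluated on the microscope image. As a minimal illustration of two commonly compared measures, not necessarily the ones used in the paper, the sketch below implements variance-of-Laplacian and normalized variance with plain NumPy; function names and kernel choice are illustrative assumptions.

```python
import numpy as np

def variance_of_laplacian(img):
    """Focus score: variance of the Laplacian response.

    In-focus images contain strong edges, so the Laplacian response
    varies strongly; defocused (smoothed) images score lower.
    """
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    kernel = np.array([[0,  1, 0],
                       [1, -4, 1],
                       [0,  1, 0]], dtype=float)
    # valid-mode 2-D convolution built from shifted slices
    out = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * img[i:i + h - 2, j:j + w - 2]
    return out.var()

def normalized_variance(img):
    """Focus score: intensity variance normalized by mean brightness,
    which reduces sensitivity to illumination changes between frames."""
    img = np.asarray(img, dtype=float)
    return img.var() / (img.mean() + 1e-12)

# Sanity check on synthetic frames: a sharp checkerboard should
# outscore a uniform (fully defocused) frame under both measures.
sharp = (np.indices((16, 16)).sum(axis=0) % 2).astype(float)
blurred = np.full((16, 16), 0.5)
```

In an autofocus sweep, the controller would evaluate such a score at each step of the focus axis and move toward the maximum.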

Results: The control system operates entirely in the image frame, eliminating the need for system calibration and thereby improving operating speed. A micromanipulation experiment performing pick-and-place of a micromesh is illustrated.
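Working entirely in the image frame means the positioning error is measured and driven to zero in pixel coordinates, so no pixel-to-metric camera calibration is needed. A minimal sketch of this idea, a proportional visual-servoing step with hypothetical detected pixel positions and a gain chosen for illustration, could look like:

```python
import numpy as np

def pixel_servo_step(tip_px, target_px, gain=0.5):
    """One proportional visual-servoing step in the image frame.

    Both the gripper tip and the target object are detected in the same
    camera image, so the error is expressed directly in pixels and the
    stage is simply commanded until that pixel error vanishes.
    """
    error = np.asarray(target_px, dtype=float) - np.asarray(tip_px, dtype=float)
    return gain * error, float(np.linalg.norm(error))

# Hypothetical run: iterate until the tip converges onto the target.
tip = np.array([12.0, 48.0])      # detected gripper-tip pixel position
target = (100.0, 64.0)            # detected target-object pixel position
for _ in range(40):
    step, err = pixel_servo_step(tip, target)
    tip += step                   # stands in for a stage motion command
```

With a gain below 1 the pixel error contracts geometrically each iteration, which is why such schemes converge without ever mapping pixels to metric units.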

Conclusion: The system demonstrates a three-fold reduction in setup and run time for fundamental micromanipulation tasks compared to manual operation. The accuracy, repeatability, and reliability of the programmed system are analyzed.

Keywords: Micromanipulation, microassembly, automation, 3D, visual servoing, microengineering.




Article Details

Pages: 229-239
DOI: 10.2174/2210681208666180119143039
