[GT] Metasurface Technology for Building Three-Dimensional Images

By Gyeongtae Kim, NATURE COMMUNICATIONS, October 10, 2022

LiDAR is an acronym for "light detection and ranging," but the name is also a portmanteau of "light" and "radar." In other words, LiDAR is radar that uses light instead of radio waves: the underlying principle is the same as that of conventional radar, but because the two use electromagnetic waves of different wavelengths, the practical technology and its range of applications differ.
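
Concretely, both radar and LiDAR measure distance the same way: time how long a pulse takes to bounce back, then multiply by the propagation speed. The following minimal Python sketch, with illustrative numbers not drawn from the article, shows the time-of-flight arithmetic.

```python
# Minimal time-of-flight sketch (illustrative, not from the article): like
# radar, LiDAR infers the distance to a target from the round-trip travel
# time of a reflected pulse -- only the wavelength of the pulse differs.

C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_round_trip(delta_t_s: float) -> float:
    """Distance to the target from a pulse's round-trip time, in meters."""
    return C * delta_t_s / 2.0  # halved: the pulse travels out and back

# Example: an echo arriving 200 ns after emission puts the target ~30 m away.
print(f"{range_from_round_trip(200e-9):.1f} m")  # -> 30.0 m
```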

Today, LiDAR is used not only to measure distance, speed, and direction, but also to measure temperature and to analyze the composition and concentration of substances in the surrounding atmosphere.

Moreover, because LiDAR can operate with ultraviolet, visible, and near-infrared light, it can detect non-metallic targets such as rock, clouds, raindrops, and aerosols, which makes it useful for weather observation and precision topographic mapping. It can even be used to determine the chemical composition of gases mixed into the air, by exploiting the fact that each kind of molecule scatters some wavelengths of light more strongly than others.
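
That last application works, roughly speaking, like differential-absorption LiDAR: compare echoes at a wavelength the gas absorbs strongly and at a nearby wavelength it barely absorbs. The toy Python sketch below applies the Beer-Lambert law with invented numbers; it illustrates the general idea only, not a method described in this article.

```python
import math

# Toy differential-absorption sketch (invented numbers, not a method from the
# article): compare echo power at a wavelength the target gas absorbs strongly
# ("on") and a nearby wavelength it barely absorbs ("off"). By the
# Beer-Lambert law, the ratio of the two returns encodes the gas concentration
# along the round-trip path.

def gas_concentration(p_on: float, p_off: float, path_m: float,
                      sigma_on: float, sigma_off: float) -> float:
    """Mean gas concentration (molecules/m^3) over a round-trip path."""
    delta_sigma = sigma_on - sigma_off  # differential absorption cross-section, m^2
    return math.log(p_off / p_on) / (2.0 * path_m * delta_sigma)

# Example: the "on" echo is 20% weaker than the "off" echo over a 1 km path.
n = gas_concentration(p_on=0.8, p_off=1.0, path_m=1000.0,
                      sigma_on=3e-24, sigma_off=1e-24)
print(f"{n:.2e} molecules/m^3")  # -> 5.58e+19 molecules/m^3
```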

LiDAR can also be considered the core technology of "autonomous vehicles," a field on which today's automakers are staking their futures.

In the autonomous vehicles of most manufacturers, LiDAR functions as the "eyes," identifying the distance to surrounding objects as well as the vehicle's speed and direction. It recognizes objects by projecting light onto them.

To detect unpredictable conditions on the road and respond nimbly, LiDAR sensors must perceive the sides and rear of the vehicle as well as the front.

Until now, however, LiDAR systems have used a rotating sensor, which makes it impossible to observe the front and rear of the vehicle simultaneously.

To overcome this issue, a Korean research team has developed a fixed LiDAR sensor that has a 360° view.

The new sensor can enable an ultra-small LiDAR system since it is made from a metasurface, an ultra-thin flat optical device only one-thousandth the thickness of a human hair.

Using the metasurface greatly expands the viewing angle of the LiDAR, allowing it to recognize objects three-dimensionally.

The research team succeeded in extending the viewing angle of the LiDAR sensor to 360° by modifying the design and periodically arranging the nanostructures that make up the metasurface.
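
As rough intuition for why a periodic nanostructure arrangement widens the field of view, consider the classic grating equation: the smaller the period relative to the wavelength, the steeper the angles at which diffraction orders leave the surface. The sketch below uses illustrative parameters, not the team's actual design.

```python
import math

# Rough intuition (illustrative parameters, not the team's actual design):
# a periodic structure diffracts a normally incident beam of wavelength lam
# into discrete orders m at angles sin(theta_m) = m * lam / period. As the
# period shrinks toward the wavelength, the surviving orders exit at very
# steep angles, spreading the dot pattern across the whole hemisphere
# (negative orders mirror the positive ones).

def diffraction_angles_deg(lam_nm: float, period_nm: float) -> list[float]:
    """Angles (degrees) of the propagating non-negative diffraction orders."""
    angles = []
    m = 0
    while m * lam_nm / period_nm <= 1.0:
        angles.append(math.degrees(math.asin(m * lam_nm / period_nm)))
        m += 1
    return angles

# Example: a 940 nm beam on a 1000 nm-period array diffracts to 0° and ~70°.
print(diffraction_angles_deg(940.0, 1000.0))  # [0.0, 70.05...]
```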

It is possible to extract three-dimensional information about objects over the full 360° region by scattering an array of more than 10,000 light dots from the metasurface onto objects and then photographing the projected dot pattern with a camera.
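
The depth extraction itself follows standard structured-light triangulation: a dot falling on a nearer surface appears shifted in the camera image, and that shift converts to distance. The sketch below shows the conversion with made-up parameters; the paper's full reconstruction pipeline is more involved.

```python
# Minimal triangulation sketch (made-up parameters; the paper's actual
# reconstruction pipeline is more involved): a dot projected onto a nearby
# surface lands shifted in the camera image relative to where it would land
# on a distant surface. That pixel shift (disparity) converts to depth via
# z = f * b / d for a projector-camera baseline b and focal length f.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth (m) of one projected dot from its observed pixel disparity."""
    return focal_px * baseline_m / disparity_px

# Example: f = 1400 px, 5 cm baseline, 20 px shift -> the dot is 3.5 m away.
print(f"{depth_from_disparity(1400.0, 0.05, 20.0):.2f} m")  # -> 3.50 m
```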

This type of LiDAR sensor is already used for the iPhone's face recognition function.

However, the dot projector the iPhone uses to create these point sets has several limitations: the uniformity and viewing angle of the dot pattern are restricted, and the device itself is bulky.

By developing a technology a step beyond conventional metasurface devices, the team has demonstrated that the propagation of light can be controlled at all angles.

This new technology will enable ultra-small, full-space 3D imaging sensor platforms, and it is likely to make LiDAR applications cheaper and more effective.

The research recently appeared in the journal Nature Communications.
 
NATURE COMMUNICATIONS, October 10, 2022; "Metasurface-driven full-space structured light for three-dimensional imaging," by Gyeongtae Kim, et al. © 2022 Springer Nature Limited. All rights reserved.

To view or purchase this article, please visit:
https://www.nature.com/articles/s41467-022-32117-2