At Hanwha Systems' research center in Yongin, Gyeonggi Province, visited recently, Nam Tae-hyung, a manager on the Solution Business Team of the Space Business Division, pulled up footage of an air base in a conflict zone on the satellite imagery analysis artificial intelligence (AI) system currently under development, and the AI immediately began analyzing it. Within tens of seconds, red and green boxes appeared across the screen, and beside them information on every ground object captured in the imagery was listed in sequence. For some objects, even the specific "model type" was displayed. The AI had analyzed imagery taken by a 1 m-resolution electro-optical (EO) satellite and made the determinations on its own.
In the satellite development clean room at Hanwha Systems' Yongin research center, Lee Kyung-muk, a principal researcher on the Satellite Payload Team, recently inspects equipment that measures tolerances in core payload components to be mounted on a reconnaissance optical satellite. Hanwha Systems is developing a system that places high-resolution optical and synthetic aperture radar (SAR) satellites into low Earth orbit to monitor specific areas around the clock and analyze the imagery with artificial intelligence (AI). Provided by Hanwha Systems
● AI reads every object on the ground
Hanwha Systems is developing an AI system that analyzes satellite imagery to extract the required information in real time. The imagery used in the demonstration was past footage provided by an external satellite imagery company for AI training purposes, but the AI system itself operates in "real-time mode."
It is not limited to identifying types of aircraft. Any object above a certain size—such as vehicles, trains, ships, and buildings—can all be identified once the system is trained. Hanwha Systems aims to develop a satellite imagery AI interpretation system that can, for example, assess enemy troop strength using satellite data or calculate damage after a strike, and to bring it to a level suitable for deployment in the field by 2028.
The core factors are the number of satellites and their resolution. For the AI system to be operationally deployed, it must be able to closely observe a specific area at least every 30 minutes. High-resolution imagery is also essential to distinguish details such as small vehicles or unit movements. For this reason, Hanwha Systems is concurrently developing an ultra‑low‑Earth‑orbit synthetic aperture radar (SAR) satellite with 15 cm‑class resolution. A 15 cm‑class resolution means that a 15 cm object on the ground corresponds to one pixel (dot) in the satellite image. According to Hanwha Systems, this is “a resolution at which a water bottle on Earth can be identified from space.” The company assesses that deploying 64 of these satellites in a 350 km orbit would make practical military operation feasible.
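As a rough illustration of what the pixel definition above implies, the sketch below converts object sizes into pixel footprints at a given ground sampling distance. Only the 15 cm and 1 m resolution figures come from the article; the object sizes are hypothetical examples chosen for illustration.

```python
def pixels_across(object_size_m: float, gsd_m: float) -> float:
    """Approximate number of image pixels an object of the given
    size spans, at a given ground sampling distance (GSD)."""
    return object_size_m / gsd_m

# 15 cm-class resolution: one pixel covers roughly 15 cm on the ground,
# so a water-bottle-sized object corresponds to about one pixel.
print(pixels_across(0.15, 0.15))  # ~1 pixel

# A hypothetical 6 m vehicle spans ~40 pixels at 15 cm GSD,
# but only ~6 pixels at the 1 m EO resolution mentioned earlier.
print(pixels_across(6.0, 0.15))
print(pixels_across(6.0, 1.0))
```

The contrast between the last two values is why higher resolution matters for distinguishing small vehicles or unit movements: the same object occupies far more pixels, giving the AI more detail to classify.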
AI not only “reads imagery” but also supports decision‑making. Satellite‑collected imagery of enemy troop dispositions is transmitted to a “mobile ground station” (a vehicle that commands operations) equipped with its own AI analysis system. The AI installed there immediately proposes to the on‑scene commander which weapons to use and how best to strike the area. It can, for instance, recommend preemptively devastating an anticipated enemy movement route to hinder operations, or selecting the optimal interceptor missile when an enemy missile attack is detected.
● Also active in disaster relief and emergency rescue
A view of Hanwha Systems’ satellite imagery analysis AI system performing resolution upscaling to sharpen a blurred image (left) into a clearer one (right). Provided by Hanwha Systems
Hanwha Systems expects that such a system can also be used for disaster mitigation and search‑and‑rescue purposes. Nam explained, “This is because the method for assessing damage in a disaster area is not much different from the way we analyze the scale of damage inflicted on the enemy by our military’s attacks on the battlefield.”
In a live demonstration, a satellite photograph of a hurricane-damaged area in Illinois in the United States was displayed on the screen. When the instruction "Analyze the scale of damage" was entered into an adjacent large language model (LLM, an AI model that can be instructed conversationally, like ChatGPT) chat window, thousands of small blue squares appeared on the satellite image a short time later. Each square represented one building. After also analyzing imagery of the same region from before the damage, the AI system produced a calculation of "8% damage rate." It indicated that there had been a total of 5,694 buildings in the satellite image taken before the hurricane, but that only 5,236 remained in the subsequent image.
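The reported 8% figure is consistent with the two building counts given in the demonstration, as a quick check shows:

```python
# Sanity check of the damage rate reported by the AI system,
# using the building counts quoted in the article.
buildings_before = 5694  # buildings detected in pre-hurricane imagery
buildings_after = 5236   # buildings remaining in post-hurricane imagery

destroyed = buildings_before - buildings_after
damage_rate = destroyed / buildings_before
print(f"{damage_rate:.1%}")  # → 8.0%
```

The same before/after differencing is what the article later notes can be reused for battlefield damage assessment: the computation does not care whether the cause of the change was a hurricane or a strike.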
According to Hanwha Systems, AI analysis capabilities can be significantly enhanced when optical satellites and SAR satellites are used together to observe the Earth’s surface. Optical satellites can capture clear images but cannot conduct observations on cloudy days and have no penetration capability. SAR satellites offer lower clarity than optical satellites but can observe regardless of weather conditions and also have penetration capability. Operating the two types of satellites in tandem would make it possible to look into spaces beneath buildings or piles of earth in disaster zones, greatly increasing the efficiency of rescue operations and enabling faster formulation of disaster response plans.
Hanwha Systems stated that it expects “broad applications in the safety sector, such as monitoring changes in ice thickness to support the opening of Arctic sea routes, or tracking boundaries of sea level and flood‑prone areas to aid in disaster prevention.”
ⓒ dongA.com. All rights reserved. Reproduction, redistribution, or use for AI training prohibited.