zad jordan news –
The SMART program of the US Intelligence Advanced Research Projects Activity (IARPA) aims, in its first phase, to automatically detect large-scale construction work as it occurs by fusing data from multiple satellites, a far harder task than simply identifying landmarks from above, given the large number of variables at play over long periods of time. In later phases, the algorithms built for this task will be adapted to monitor natural cyclical changes, such as crop growth, and ultimately to track natural and man-made change alike.
The website of the American magazine “Wired” pointed to a time-lapse video of the city of Dubai, compiled from satellite images captured at intervals from hundreds of kilometers above. The film shows Dubai not as we know it today but as it was in 1984, then moves to 2003, when an island in the shape of a palm tree appears, followed by another island in 2007, then more buildings and roads, until by 2020 the coastline and the land around it have been completely transformed by the glamorous construction projects for which the city is known.
The video is one way to show long-term change using Google Earth data. The 38-year record of Dubai’s development draws on archival imagery of a single site over a period when large-scale construction took place.
The website notes that in satellite imagery, looking backward is easy. But what about automatically detecting massive construction projects as they occur, anywhere on Earth, without knowing when or where a skyscraper might appear? That is much harder. Through the SMART program, short for Space-based Machine Automated Recognition Technique, IARPA is attempting to do just that: “harmonizing” data from multiple Earth-observing satellites and asking software to search through it for signs of change, whether natural or man-made.
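The basic idea of searching a harmonized image time series for signs of change can be sketched in miniature. The snippet below is only an illustration of per-pixel temporal change detection, not IARPA's actual pipeline; all names and thresholds are invented:

```python
import numpy as np

def flag_change(series, threshold=0.3):
    """Flag pixels whose reflectance shifts sharply between
    consecutive co-registered observations of the same area.

    series: array of shape (T, H, W), one band over T dates,
    values normalized to [0, 1]. Returns a boolean (H, W) mask
    of pixels that changed by more than `threshold` at least once.
    """
    diffs = np.abs(np.diff(series, axis=0))  # (T-1, H, W) frame-to-frame change
    return diffs.max(axis=0) > threshold

# A tiny 3-date, 2x2 example: one pixel brightens abruptly
# (e.g. bare ground replacing vegetation); the rest stay stable.
series = np.array([
    [[0.2, 0.2], [0.2, 0.2]],
    [[0.2, 0.2], [0.2, 0.2]],
    [[0.9, 0.2], [0.2, 0.2]],
])
mask = flag_change(series)
# only the top-left pixel is flagged
```

Real systems must first co-register and radiometrically align the images before any per-pixel comparison like this is meaningful.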
Many parties want to use these images to understand what is happening on the ground, from spy agencies to climate scientists, insurance companies, and forest rangers. But, the website reports, satellites now produce far more data than human analysts can review, and automating at least part of the analysis removes some of the tedious work so that people have time to interpret the results.
The program’s initial focus is on identifying and monitoring heavy construction work, rather than merely locating objects from above. Spotting a construction site means identifying many objects, tracking how the site changes over time, and extracting a pattern from those changes.
“SMART is trying to figure out what all of this means together,” the website quotes the agency’s program manager, Jack Cooper, as saying. Construction is a reasonable test for this type of analysis: it can look different in a forest than on a beach, or between silos and housing estates; it passes through several stages; it takes place over many years; and there is no single clear indicator of it. Satellite imagery algorithms can, for example, identify all the dump trucks in an area, but SMART’s teams cannot simply build a detector for those heavy vehicles to establish that construction is under way, because trucks often appear in places where no building work is happening: they travel on highways, or sit parked at their owners’ homes. Nor can the software simply send an alert when vegetation turns from green to brown, which might reflect trees being cleared from a site. How all these pieces of the puzzle fit together over time is what indicates the presence of heavy construction. “That’s what makes it a challenge. It’s not just one thing, or just one change.”
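Cooper's point, that no single cue is decisive but their sustained agreement is, can be illustrated with a toy fusion score. The indicator names, weights, and persistence window below are invented for illustration and are not SMART's actual method:

```python
import numpy as np

def construction_score(indicators, weights=None):
    """Combine weak monthly indicator scores (each in [0, 1]) into
    one site-level construction score.

    indicators: dict mapping indicator name -> list of monthly
    scores (e.g. truck density, vegetation loss, new built-up
    pixels). No single cue is decisive; sustained agreement is.
    """
    names = sorted(indicators)
    mat = np.array([indicators[n] for n in names])   # (K cues, T months)
    if weights is None:
        weights = np.ones(len(names)) / len(names)
    monthly = weights @ mat                          # weighted fusion per month
    kernel = np.ones(3) / 3                          # require 3-month persistence
    return float(np.convolve(monthly, kernel, mode="valid").max())

site = {  # genuine construction: all cues rise together
    "truck_density":   [0.1, 0.6, 0.7, 0.8, 0.7],
    "vegetation_loss": [0.0, 0.5, 0.8, 0.8, 0.9],
    "new_buildings":   [0.0, 0.0, 0.3, 0.6, 0.9],
}
highway = {  # trucks alone, with no supporting evidence
    "truck_density":   [0.9, 0.9, 0.9, 0.9, 0.9],
    "vegetation_loss": [0.0, 0.0, 0.0, 0.0, 0.0],
    "new_buildings":   [0.0, 0.0, 0.0, 0.0, 0.0],
}
# the construction site outscores the truck-filled highway
```

The highway scores low despite its constant truck traffic, which mirrors why a bare dump-truck detector would flood analysts with false alarms.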
The SMART program began in early 2021, when IARPA awarded contracts to teams led by Accenture, BlackSky, Systems and Technology Research, Kitware, Applied Research Associates, Astra, and Intelligent Automation. Some work on construction monitoring itself, others on underlying technical issues: satellites do not all see the world the same way, and each has its own characteristics. The same green dot may look different from one satellite to another, and from one day to the next, because of the angle of the sun, the state of the atmosphere, and so on.
Each team is working on solutions. Kitware, for instance, tries both to monitor construction work and to “harmonize” the differences between satellite images, aligning the images along shared parameters so they can be compared and processed side by side.
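One simple form of harmonization is radiometric normalization: linearly rescaling one sensor's band so its statistics match a reference sensor's view of the same scene. This sketch is a generic technique for illustration, not Kitware's specific method:

```python
import numpy as np

def harmonize(src, ref):
    """Linearly rescale one sensor's band so its mean and standard
    deviation match a reference sensor's image of the same scene,
    making the two views comparable side by side."""
    gain = ref.std() / src.std()
    offset = ref.mean() - gain * src.mean()
    return gain * src + offset

rng = np.random.default_rng(0)
scene = rng.uniform(0.1, 0.5, size=(64, 64))   # "true" surface reflectance
sensor_a = scene                               # reference sensor
sensor_b = 1.8 * scene + 0.05                  # same scene, brighter calibration
fixed = harmonize(sensor_b, sensor_a)
# after harmonization, the two views agree closely
```

Operational harmonization also has to account for differing resolutions, spectral band definitions, and viewing geometry, which a single linear fit cannot capture.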
Eventually, the SMART project completed its first phase: the teams finished building algorithms to monitor construction work and tested their capabilities against more than 100,000 satellite images, covering 90,000 square kilometers of terrain captured between 2014 and 2021.
It was a hard-fought contest to prove which methods work best at assembling the various clues that point to new construction work.
The images the teams analyzed came from several satellite constellations: the joint “Landsat” program of NASA and the United States Geological Survey, the European Space Agency’s “Sentinel” satellites, and the companies “Maxar” and “Planet”. The teams’ software had to identify construction work where it existed and avoid false positives where there was none: flag the images of Dubai, and pass over the Amazon rainforest. “It is essential that systems can handle these two cases, and everything in between,” Cooper explained.
Partner organizations such as the Johns Hopkins University Applied Physics Laboratory, NASA’s Goddard Space Flight Center, and the US Geological Survey reviewed the imagery to determine which detections were true and which false. By mid-spring, nearly 1,000 construction sites across 27 regions had been annotated, with their progress tracked over time. The teams ran the images through their software and delivered their results at the end of April.
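Scoring flagged sites against reviewer-annotated ground truth comes down to counting hits, false alarms, and misses. The site names below are made up; this is the standard precision/recall computation, not the program's exact scoring protocol:

```python
def precision_recall(predicted, annotated):
    """Score a team's flagged sites against reviewer-annotated
    ground truth. Both arguments are sets of site identifiers."""
    tp = len(predicted & annotated)   # correctly flagged construction
    fp = len(predicted - annotated)   # false alarms (no construction there)
    fn = len(annotated - predicted)   # real construction that was missed
    precision = tp / (tp + fp) if predicted else 0.0
    recall = tp / (tp + fn) if annotated else 0.0
    return precision, recall

truth = {"dubai_palm", "riyadh_tower", "lagos_port"}     # hypothetical labels
flags = {"dubai_palm", "riyadh_tower", "amazon_patch"}   # hypothetical output
p, r = precision_recall(flags, truth)
# p == 2/3 (one false alarm), r == 2/3 (one missed site)
```

Cooper's "Dubai versus Amazon" requirement is exactly the tension between these two numbers: flagging aggressively raises recall but floods the score with false positives.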
For their part, Kitware’s engineers trained their network on the images, on selected features, and on the connections between them. Their analysis combined several techniques. One, called material characterization, analyzes pixels to determine whether they show concrete or soil, for example. The second is semantic segmentation, determining which pixels in an image belong to what kind of object, whether a “building”, “tree”, “new island”, or “road”. “We have a fusion method that learns how these features fit together,” said Anthony Hoogs, vice president of artificial intelligence at Kitware. The model also contains another kind of algorithm, a variety of machine learning called a transformer. Transformers take sequences of images captured over time at a place where construction is under way and trace the links between them: green areas may disappear and built-up ones grow, and the program learns context to extract meaning from the visual scene.
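The material-characterization idea can be sketched with crude per-pixel spectral rules tracked across dates. The thresholds and band values below are invented for illustration and bear no relation to Kitware's trained models:

```python
import numpy as np

def classify(red, nir):
    """Toy material map from red and near-infrared reflectance:
    0 = vegetation, 1 = soil, 2 = concrete (rough NDVI rules)."""
    ndvi = (nir - red) / (nir + red + 1e-9)
    out = np.full(red.shape, 2)            # default: bright, flat -> concrete
    out[ndvi > 0.3] = 0                    # strong NDVI -> vegetation
    out[(ndvi <= 0.3) & (red < 0.4)] = 1   # darker, low NDVI -> bare soil
    return out

def concrete_fraction(red, nir):
    return float((classify(red, nir) == 2).mean())

# Three dates at one 4x4 site: vegetation gives way to bare soil,
# then to paving. A rising concrete fraction is one cue a fusion
# model over the image sequence can associate with construction.
veg      = (np.full((4, 4), 0.1), np.full((4, 4), 0.5))   # (red, nir) bands
soil     = (np.full((4, 4), 0.3), np.full((4, 4), 0.35))
paved    = (np.full((4, 4), 0.5), np.full((4, 4), 0.5))
fractions = [concrete_fraction(*date) for date in (veg, soil, paved)]
# fractions climb from 0.0 to 1.0 as the site is paved
```

In the real system a transformer consumes such per-date feature maps as a sequence and learns which temporal progressions mean construction.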
Accenture, in turn, approached the task differently: by rethinking the large training datasets that are usually needed to teach software how to interpret a scene. These images, often numbering in the thousands, typically have to be labeled by a person before they can be fed to an AI model as examples of what to recognize. That may work for small objects, such as photographs of cats, but it is much harder for a complex landscape seen from above. “Think of all the things you can see in a picture of a city,” says Marc Bosch Ruiz, managing director and computer vision lead at Accenture. It would take weeks for one person to label all those parts.
So, with help from academics, the company focused on developing techniques by which the software works out for itself what things on the ground are and how they are changing. These techniques rely on an approach called unsupervised learning, in which researchers give a neural network a large amount of unlabeled data and let it learn patterns and features on its own. Accenture, for example, took random snippets of the same satellite image, fed them to the network, and asked: are these two patches from the same image, or from different ones? In this way, says Bosch Ruiz, the network learns what the pixels of a given image have in common, grouping objects and activities into categories and telling different shapes apart.
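The "same image or different?" question is the heart of contrastive self-supervised learning. The sketch below fakes the learned embedding with hand-made statistics so the mechanics are visible; everything here is illustrative and is not Accenture's actual network:

```python
import numpy as np

rng = np.random.default_rng(42)

def random_crop(image, size=8):
    """Cut a random square patch out of a 2-D image."""
    h, w = image.shape
    y = rng.integers(0, h - size + 1)
    x = rng.integers(0, w - size + 1)
    return image[y:y + size, x:x + size]

def embed(patch):
    """Stand-in for the network: a fixed feature vector of
    (mean, spread, edge energy). A real model learns this."""
    gy, gx = np.gradient(patch)
    return np.array([patch.mean(), patch.std(),
                     np.abs(gy).mean() + np.abs(gx).mean()])

def similarity(p1, p2):
    """Cosine similarity between patch embeddings; contrastive
    training pushes this up for patches of the same image and
    down for patches of different images."""
    a, b = embed(p1), embed(p2)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

flat = np.full((32, 32), 0.2)                       # featureless terrain
ramp = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))  # graded terrain
same = similarity(random_crop(flat), random_crop(flat))
diff = similarity(random_crop(flat), random_crop(ramp))
# same > diff: patches of one image share statistics
```

Because no human labels are involved, the network can train on vastly more imagery than any annotation team could mark up, which is the point Bosch Ruiz is making.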
Now that the teams have submitted their results, they must next apply their algorithms to different use cases. “Can algorithms built to find construction find crop growth?” asks Cooper. That is a big shift, he says, because it swaps slow, man-made change for cyclical, natural environmental change. In the third phase, which begins in early 2024, the remaining competitors will try to turn their work into what Cooper calls a “robust capability”: something that can monitor natural and man-made change alike.
According to the website, this will allow the SMART algorithms to serve civilian uses as well, and the datasets IARPA creates for its programs are often made publicly available for other researchers to use.
Hoogs believes the lessons learned from the software Kitware developed for SMART will apply to the environmental sciences, whether for marine protection, fisheries, or deforestation. And Bosch Ruiz notes that automatic interpretation of landscape change has clear implications for the study of climate change: melting ice, coral bleaching, and land degradation. Monitoring new construction can also reveal human encroachment, as forest is turned into farmland or housing.