AI-assisted targeting by Israel in the Gaza Strip

As part of the Israel–Hamas war, the Israel Defense Forces (IDF) have used artificial intelligence to rapidly and automatically perform much of the process of determining what to bomb. Israel has greatly expanded its bombing of the Gaza Strip, which in previous wars had been limited by the Israeli Air Force running out of targets.

These tools include the Gospel, an AI that automatically reviews surveillance data looking for buildings, equipment, and people thought to belong to the enemy and, upon finding them, recommends bombing targets to a human analyst, who may then decide what action to take against the target.

Critics have argued that the use of these AI tools puts civilians at risk, blurs accountability, and results in militarily disproportionate violence in violation of international humanitarian law.

The Habsora system


Israel uses an AI system named "Habsora" ("the Gospel") to determine which targets the Israeli Air Force will bomb.[1] It automatically provides a targeting recommendation to a human analyst,[2][3] who decides whether to pass it along to soldiers in the field.[3]

AI can process intelligence data far faster than humans can.[4][5] Retired general Aviv Kochavi, head of the IDF until 2023, stated that the system could produce 100 bombing targets in Gaza per day.[6] One source interviewed by NPR put these figures at 50 to 100 targets in 300 days for 20 intelligence officers, versus 200 targets within 10 to 12 days for Habsora.[7]

The Guardian published testimony from several Israeli military intelligence officers saying that Palestinian men linked to Hamas's military wing were treated as potential targets regardless of rank or importance,[8] and that low-ranking Hamas members were preferentially targeted when believed to be at home.[9] Two of the sources said that strikes on low-ranking militants were typically carried out with unguided bombs, destroying entire homes and killing everyone inside. One of these sources said Israel did not want to waste expensive bombs on unimportant people.[10]

Technological background


Artificial intelligence, despite the name, is not capable of thought or consciousness.[11] Rather, AI systems are designed to perform tasks that humans would ordinarily accomplish using their cognitive abilities. The Gospel uses machine learning,[12] in which an AI is tasked with identifying patterns in large data sets (such as images of cancerous tissue, photos showing facial expressions, or surveillance footage of Hamas members identified by human analysts) and then searching for those patterns in new data.[13]
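As a rough illustration of that general technique, and only of the technique (the Gospel's actual inputs, models, and code are not public), the following Python sketch trains a classifier on synthetic, analyst-style labelled examples and then scores unseen data for the same pattern. Every feature and variable name here is a hypothetical stand-in:

```python
# A minimal sketch of supervised machine learning in general, NOT the
# Gospel itself, whose design is not public. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical stand-in features: one row per observation,
# one column per measured attribute.
X = rng.normal(size=(1000, 8))
# Labels a human analyst would have assigned during training
# (1 = matches the pattern of interest, 0 = does not).
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit the model to the labelled examples...
model = LogisticRegression().fit(X_train, y_train)

# ...then apply it to data it has never seen. The output is a
# statistical probability of a pattern match, not reasoning or proof.
print("held-out accuracy:", model.score(X_test, y_test))
print("P(match) for one new item:", model.predict_proba(X_test[:1])[0, 1])
```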

Which intelligence sources the Gospel draws on is unclear, but it is believed to combine surveillance data from a wide range of sources.[14] Its recommendations are produced by pattern recognition: a person who shows enough similarities to other people classified as enemy combatants may be classified as a combatant as well.[12] Heidy Khlaaf, engineering director of AI Assurance at the technology security firm Trail of Bits, was asked by NPR about AI's suitability for this task. She stated that "AI algorithms are notoriously flawed with high error rates observed across applications that require precision, accuracy, and safety."[7] Bianca Baggiarini, a lecturer at the Australian National University's Strategic and Defence Studies Centre, said AIs are "more effective in predictable environments where concepts are objective, reasonably stable, and internally consistent". She highlighted the challenge of distinguishing combatants from non-combatants, a task that even humans frequently struggle to accomplish.[15] Khlaaf stressed that the decisions such a system makes depend entirely on the data it was trained on; they derive not from reasoning, factual evidence, or causation, but purely from statistical probability.[16]
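The similarity-based labelling described above can be made concrete with a short sketch. Here a k-nearest-neighbours classifier, chosen purely for illustration over invented data (there is no public indication of which algorithms the Gospel actually uses), labels a new case by how similar it is to already-labelled cases:

```python
# Illustration of labelling-by-similarity (k-nearest neighbours).
# Hypothetical synthetic data only; not the Gospel's actual algorithm.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)

# 200 cases previously labelled by analysts: 0 = non-combatant, 1 = combatant.
known_features = rng.normal(size=(200, 5))
known_labels = (known_features[:, 0] > 0.3).astype(int)

knn = KNeighborsClassifier(n_neighbors=7).fit(known_features, known_labels)

# A new, unlabelled case is classified by a vote among the 7 most similar
# labelled cases: enough similarity to cases labelled "combatant" yields
# a "combatant" label -- a statistical correlation, not evidence.
new_case = rng.normal(size=(1, 5))
print("label:", int(knn.predict(new_case)[0]))
print("share of similar cases labelled combatant:",
      knn.predict_proba(new_case)[0, 1])
```

As Khlaaf's criticism emphasizes, the output is only as reliable as the labelled examples: any error or bias in the training labels propagates directly into the statistical predictions.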

The Gospel is used by the military's target administration division,[3] also known as the Directorate of Targets or Targeting Directorate,[17] which was formed in 2019 in the IDF's Intelligence Directorate.[18] The division was created to address the problem of the air force running out of targets to strike. Kochavi described the division as "powered by AI capabilities" and comprising hundreds of officers and soldiers.[19] Beyond its wartime functions, the Guardian reported that in recent years the division had helped the IDF build a database of some 30,000 to 40,000 suspected militants, and that systems such as the Gospel had played a critical role in building lists of individuals authorized to be assassinated.[18]

Use in 2021


Kochavi compared the division using the Gospel to a machine, stating that once the machine was activated during the May 2021 conflict it generated 100 targets a day, of which 50 were attacked. This marked a substantial increase over the previous rate of 50 targets in Gaza per year.[20] According to the military, about 200 of the targets Israel struck during that conflict came from the Gospel,[21] out of a total of 1,500 targets struck, a figure that includes both static and moving targets.[22] In an after-action report, the Jewish Institute for National Security of America identified a problem: the system had ample data on what constituted a target, but lacked data on what did not.[23] Such a system relies entirely on its training data,[16] and because intelligence that human analysts had reviewed and rejected as non-targets had been discarded rather than collected, the system's training risked being biased. The institute's vice president expressed hope that this had since been rectified.[22]
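The bias the report describes, training on confirmed targets without the corresponding rejected non-targets, can be illustrated in a few lines of Python. This deliberately simplified, fully synthetic sketch is a generic demonstration of sampling bias, not a model of the actual system; withholding the borderline negative examples shifts the learned decision boundary so that a true non-target is scored as a likely target:

```python
# Sketch of the training-data gap described in the report.
# All data is synthetic and deliberately one-dimensional.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Ground truth: anything with feature value above 0 is a true target.
X = rng.normal(size=(5000, 1))
y = (X[:, 0] > 0).astype(int)

# Biased training set: every positive example is kept, but only the
# "obvious" negatives (x < -1); the borderline cases analysts reviewed
# and discarded were never collected.
keep = (y == 1) | (X[:, 0] < -1)
biased = LogisticRegression(max_iter=1000).fit(X[keep], y[keep])

# Comparison model trained on the full, unbiased data.
fair = LogisticRegression(max_iter=1000).fit(X, y)

# A borderline case that is truly NOT a target (x = -0.5):
borderline = np.array([[-0.5]])
print("biased model P(target):", biased.predict_proba(borderline)[0, 1])
print("fair model   P(target):", fair.predict_proba(borderline)[0, 1])
```

The skew here is purely an artifact of what was left out of the training set, which is the report's point: the analysts' discarded decisions were never available to the model.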

Reactions

  • António Guterres, United Nations Secretary-General, said he was "deeply troubled" by reports that Israel was using artificial intelligence in its military operations in Gaza, saying the practice puts civilians at risk and blurs accountability.[24]
  • Marc Owen Jones, a professor at Hamad Bin Khalifa University, said of the AI system: "Let's be clear: this is an AI-assisted genocide, and going forward, there needs to be a call for a moratorium on the use of AI in the war."[25]
  • Ben Saul, a United Nations special rapporteur, stated that if reports about Israel's use of AI were accurate, "many Israeli strikes in Gaza would be war crimes".[26]

References

  1. Lee, Gavin (12 December 2023). "Understanding how Israel uses 'Gospel' AI system in Gaza bombings". France24. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
  2. Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. The Gospel is actually one of several AI programs being used by Israeli intelligence, according to Tal Mimran, a lecturer at Hebrew University in Jerusalem who has worked for the Israeli government on targeting during previous military operations. Other AI systems aggregate vast quantities of intelligence data and classify it. The final system is the Gospel, which makes a targeting recommendation to a human analyst. Those targets could be anything from individual fighters, to equipment like rocket launchers, or facilities such as Hamas command posts.
  3. Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. A brief blog post by the Israeli military on November 2 lays out how the Gospel is being used in the current conflict. According to the post, the military's Directorate of Targets is using the Gospel to rapidly produce targets based on the latest intelligence. The system provides a targeting recommendation for a human analyst who then decides whether to pass it along to soldiers in the field.

    "This isn't just an automatic system," Misztal emphasizes. "If it thinks it finds something that could be a potential target, that's flagged then for an intelligence analyst to review."

    The post states that the targeting division is able to send these targets to the IAF and navy, and directly to ground forces via an app known as "Pillar of Fire," which commanders carry on military-issued smartphones and other devices.
  4. Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Algorithms can sift through mounds of intelligence data far faster than human analysts, says Robert Ashley, a former head of the U.S. Defense Intelligence Agency. Using AI to assist with targeting has the potential to give commanders an enormous edge.

    "You're going to make decisions faster than your opponent, that's really what it's about," he says.
  5. Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Militaries and soldiers frame their decision-making through what is called the “OODA loop” (for observe, orient, decide, act). A faster OODA loop can help you outmanoeuvre your enemy. The goal is to avoid slowing down decisions through excessive deliberation, and instead to match the accelerating tempo of war. So the use of AI is potentially justified on the basis it can interpret and synthesise huge amounts of data, processing it and delivering outputs at rates that far surpass human cognition.
  6. Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
  7. Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
  8. McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. In the weeks after the Hamas-led 7 October assault on southern Israel, in which Palestinian militants killed nearly 1,200 Israelis and kidnapped about 240 people, the sources said there was a decision to treat Palestinian men linked to Hamas’s military wing as potential targets, regardless of their rank or importance.
  9. McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. The testimonies published by +972 and Local Call may explain how such a western military with such advanced capabilities, with weapons that can conduct highly surgical strikes, has conducted a war with such a vast human toll.

    When it came to targeting low-ranking Hamas and PIJ suspects, they said, the preference was to attack when they were believed to be at home. “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one said. “It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
  10. McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. Two sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants. Attacks on such targets were typically carried out using unguided munitions known as “dumb bombs”, the sources said, destroying entire homes and killing all their occupants.

    “You don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country and there’s a shortage [of those bombs],” one intelligence officer said. Another said the principal question they were faced with was whether the “collateral damage” to civilians allowed for an attack.

    “Because we usually carried out the attacks with dumb bombs, and that meant literally dropping the whole house on its occupants. But even if an attack is averted, you don’t care – you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”
  11. Mueller, John Paul; Massaron, Luca (2016). Machine Learning For Dummies®. Hoboken, New Jersey: John Wiley & Sons. ISBN 978-1-119-24551-3. p. 13: Machine learning relies on algorithms to analyze huge datasets. Currently, machine learning can’t provide the sort of AI that the movies present. Even the best algorithms can’t think, feel, present any form of self-awareness, or exercise free will.
  12. Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024. How does the system produce these targets? It does so through probabilistic reasoning offered by machine learning algorithms.

    Machine learning algorithms learn through data. They learn by seeking patterns in huge piles of data, and their success is contingent on the data’s quality and quantity. They make recommendations based on probabilities.

    The probabilities are based on pattern-matching. If a person has enough similarities to other people labelled as an enemy combatant, they too may be labelled a combatant themselves.
  13. Mueller, John Paul; Massaron, Luca (2016). Machine Learning For Dummies®. Hoboken, New Jersey: John Wiley & Sons. ISBN 978-1-119-24551-3. p. 33: The secret to machine learning is generalization. The goal is to generalize the output function so that it works on data beyond the training set. For example, consider a spam filter. Your dictionary contains 100,000 words (actually a small dictionary). A limited training dataset of 4,000 or 5,000 word combinations must create a generalized function that can then find spam in the 2^100,000 combinations that the function will see when working with actual data.
  14. Inskeep, Steve. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. The system is called the Gospel. And basically, it takes an enormous quantity of surveillance data, crunches it all together and makes recommendations about where the military should strike.
  15. Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Some claim machine learning enables greater precision in targeting, which makes it easier to avoid harming innocent people and using a proportional amount of force. However, the idea of more precise targeting of airstrikes has not been successful in the past, as the high toll of declared and undeclared civilian casualties from the global war on terror shows.

    Moreover, the difference between a combatant and a civilian is rarely self-evident. Even humans frequently cannot tell who is and is not a combatant.

    Technology does not change this fundamental truth. Often social categories and concepts are not objective, but are contested or specific to time and place. But computer vision together with algorithms are more effective in predictable environments where concepts are objective, reasonably stable, and internally consistent.
  16. Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. The Israeli military did not respond directly to NPR's inquiries about the Gospel. In the November 2 post, it said the system allows the military to "produce targets for precise attacks on infrastructures associated with Hamas, while causing great damage to the enemy and minimal harm to those not involved," according to an unnamed spokesperson.

    But critics question whether the Gospel and other associated AI systems are in fact performing as the military claims. Khlaaf notes that artificial intelligence depends entirely on training data to make its decisions.

    "The nature of AI systems is to provide outcomes based on statistical and probabilistic inferences and correlations from historical data, and not any type of reasoning, factual evidence, or 'causation,'" she says.
  17. Leshem, Ron (30 June 2023). "IDF possesses Matrix-like capabilities, ex-Israeli army chief says". Ynetnews. Retrieved 26 March 2024.
  18. Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. In early November, the IDF said “more than 12,000” targets in Gaza had been identified by its target administration division.

    The activities of the division, formed in 2019 in the IDF’s intelligence directorate, are classified.

    However a short statement on the IDF website claimed it was using an AI-based system called Habsora (the Gospel, in English) in the war against Hamas to “produce targets at a fast pace”.

    [...] In recent years, the target division has helped the IDF build a database of what sources said was between 30,000 and 40,000 suspected militants. Systems such as the Gospel, they said, had played a critical role in building lists of individuals authorised to be assassinated.
  19. Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. The target division was created to address a chronic problem for the IDF: in earlier operations in Gaza, the air force repeatedly ran out of targets to strike. Since senior Hamas officials disappeared into tunnels at the start of any new offensive, sources said, systems such as the Gospel allowed the IDF to locate and attack a much larger pool of more junior operatives.

    One official, who worked on targeting decisions in previous Gaza operations, said the IDF had not previously targeted the homes of junior Hamas members for bombings. They said they believed that had changed for the present conflict, with the houses of suspected Hamas operatives now targeted regardless of rank.
  20. Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. Aviv Kochavi, who served as the head of the IDF until January, has said the target division is “powered by AI capabilities” and includes hundreds of officers and soldiers.

    In an interview published before the war, he said it was “a machine that produces vast amounts of data more effectively than any human, and translates it into targets for attack”.

    According to Kochavi, “once this machine was activated” in Israel’s 11-day war with Hamas in May 2021 it generated 100 targets a day. “To put that into perspective, in the past we would produce 50 targets in Gaza per year. Now, this machine produces 100 targets a single day, with 50% of them being attacked.”
  21. Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. A report by the Israeli publication +972 Magazine and the Hebrew-language outlet Local Call asserts that the system is being used to manufacture targets so that Israeli military forces can continue to bombard Gaza at an enormous rate, punishing the general Palestinian population.

    NPR has not independently verified those claims, and it's unclear how many targets are currently being generated by AI alone. But there has been a substantial increase in targeting, according to the Israeli military's own numbers. In the 2021 conflict, Israel said it struck 1,500 targets in Gaza, approximately 200 of which came from the Gospel. Since October 7, the military says it has struck more than 22,000 targets inside Gaza — a daily rate more than double that of the 2021 conflict.

    The toll on Palestinian civilians has been enormous.
  22. Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Misztal's group documented one of the first trials of the Gospel, during a 2021 conflict in Gaza between Israel and the militant groups Hamas and Islamic Jihad. According to press reports and statements from the military itself, Israel used the Gospel and other AI programs to identify likely targets such as rocket launchers. The system was used to identify static targets as well as moving targets as they appeared on the battlefield. According to press reports, it identified around 200 targets in the conflict.

    But it was not without its problems. The after-action report by Misztal's group noted that, while the AI had plenty of training data for what constituted a target, it lacked data on things that human analysts had decided were not targets. The Israeli military hadn't collected the target data its analysts had discarded, and as a result the system's training had been biased.

    "It's been two years since then, so it's something that, hopefully, they've been able to rectify," Misztal says.
  23. [1]
  24. "UN chief 'deeply troubled' by reports Israel using AI to identify Gaza targets". France 24 (به انگلیسی). 2024-04-05. Retrieved 2024-04-06.
  25. "'AI-assisted genocide': Israel reportedly used database for Gaza kill lists". Al Jazeera. Retrieved 12 April 2024.
  26. "'AI-assisted genocide': Israel reportedly used database for Gaza kill lists". Al Jazeera. Retrieved 16 April 2024.