Supreme Court judge speaks about the use of artificial intelligence in justice

On the fourth day of the training for judges of the High Anti-Corruption Court, organized by the National School of Judges of Ukraine in cooperation with the German Foundation for International Legal Cooperation, Jan Bernaziuk, judge of the Supreme Court in the Cassation Administrative Court, delivered a report on «Artificial intelligence in justice: risks of algorithmic bias and discrimination».
The speaker drew attention to Article 16 of the Code of Judicial Ethics, which sets out the cases in which the use of artificial intelligence technologies is permissible.

The judge described algorithmic bias as a systematic error in artificial intelligence systems or automated algorithms that leads to unfair or discriminatory treatment of certain groups.

During his speech, Jan Bernaziuk also introduced the audience to Amara's law. He explained that this principle, formulated by the researcher Roy Amara, reflects society's tendency first to greet new technologies with hype and then, after the initial disappointment, to overlook their profound long-term influence on various areas.
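To illustrate the notion of algorithmic bias mentioned above, here is a minimal, purely hypothetical Python sketch (not drawn from the report): a scoring rule that relies on a proxy feature correlated with group membership ends up approving equally qualified applicants from one group far less often than those from another.

```python
import random

random.seed(0)

# Hypothetical example: both groups have the same ability distribution,
# but most members of group B live in "district 2", which historical
# (already biased) data associated with worse outcomes.

def make_applicant(group):
    ability = random.gauss(0.6, 0.1)
    district = 2 if (group == "B" and random.random() < 0.8) else 1
    return {"group": group, "ability": ability, "district": district}

def biased_score(applicant):
    # The "learned" rule subtracts a penalty for district 2 -- a proxy for
    # group membership that has nothing to do with the applicant's ability.
    penalty = 0.15 if applicant["district"] == 2 else 0.0
    return applicant["ability"] - penalty

applicants = [make_applicant(g) for g in ("A", "B") for _ in range(1000)]
threshold = 0.55

for g in ("A", "B"):
    members = [a for a in applicants if a["group"] == g]
    approved = sum(biased_score(a) >= threshold for a in members)
    print(f"group {g}: approval rate {approved / len(members):.0%}")
```

Running the sketch shows a markedly lower approval rate for group B even though the underlying ability of both groups is identical, which is exactly the kind of systematic, unfair treatment the speaker referred to.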
The judge also considered the concept of the «black box», which in the context of AI describes a situation where algorithms produce decisions while their internal reasoning remains opaque to users and even to developers. He also spoke about the «Centaur's dilemma» in justice. According to the speaker, this dilemma is the challenge of drawing the line of responsibility between the judge and AI: if the judge trusts AI unconditionally, he risks losing control, critical thinking and impartiality (and delivering a biased decision), whereas ignoring AI slows down justice and reduces its effectiveness.
Jan Bernaziuk also outlined Moravec's paradox in the field of AI, which describes the inverse difficulty of tasks for humans and machines: high-level intellectual tasks that seem hard for people are easy for AI to perform, whereas basic sensorimotor skills are extremely difficult for AI.
The speaker drew attention to AI-based applications that can be used both by individuals exercising their right of access to court and in the work of the judiciary. The judge also gave practical advice on how to formulate a correct query to an AI system.
In addition, Jan Bernaziuk recalled the most important events of 2024 in the field of AI. Among them were the stated position of the Supreme Court on the use of AI by participants in court proceedings (the decision of the Cassation Commercial Court of the Supreme Court of 8 February 2024 in case No. 925/200/22); the position of the Congress of Judges of Ukraine on the use of AI by judges (the Code of Judicial Ethics of 18 September 2024); and the approval by the All-Ukrainian NGO «Lawyers of Ukraine» of the professional standard «Judge» and its submission on 26 December 2024 by the National Qualifications Agency to the register of qualifications.
The presentation by Jan Bernaziuk is available at: https://court.gov.ua/storage/portal/supreme/prezentacii_2025/125_ai_algorithmic_bias_discrimination_risks_bernaziuk.p