Deepfake Offence Proposal Returned for Further Study | THE DAILY TRIBUNE | KINGDOM OF BAHRAIN

Deepfake Offence Proposal Returned for Further Study

A proposal to create a standalone deepfake offence was withdrawn from debate in the Shura Council on Sunday and sent back to committee for further study, with Dr Ebtesam Al Dallal casting the lone vote against the move.

Members agreed to return the draft to the Foreign Affairs, Defence and National Security Committee for a fresh review before it comes back to the chamber at a later sitting.

The proposal would add a new Article 10 bis to Law No. 60 of 2014 on Information Technology Crimes. It would punish anyone who produces or falsifies audio or visual material using an information technology system, then shares, publishes, transmits, distributes, sends or makes it available in a way that could make another person a target for contempt or punishment, violate honour, tarnish families’ reputations, or serve an unlawful purpose. The draft sets a fine of between 3,000 and 10,000 dinars and allows for imprisonment, with courts able to impose either penalty or both.

The proposal was submitted by Ali Al Shihabi, First Deputy Chairman Jamal Fakhro, Dr Mohammed Ali Hassan, Khalid Al Maskati and Dalal Al Zayed. The committee had urged members not to pass it in its current form, saying existing criminal laws already cover the harm caused by fabricated audio and video and raising concerns about proof and practical enforcement.

Speaking for the committee, rapporteur Ali Al Aradi said the draft aimed to criminalise the use of information technology systems to produce or falsify audio or visual material and then circulate it in ways that harm others. He said the committee’s view was that the criminal texts already in force are enough to deal with misuse, without creating a new offence tied to a single technique.

The committee described deepfakes as content made using artificial intelligence and machine learning to create, alter or combine audio or video so that a real person or event appears to be shown authentically when the material is fabricated, in a form that can be hard to detect, with words or acts attributed to someone other than their true source.

In the chamber, the proposers argued that the absence of recorded cases was not a reason to wait. Al Shihabi told members: ‘Saying it doesn’t exist in Bahrain does not mean we wait for it to happen.’

He argued that relying only on general Penal Code offences leaves grey areas when the conduct takes place entirely through information technology systems, and pointed to the Information Technology Crimes Law itself as an example of Bahrain enacting specific texts for specific conduct. He also cited a case in the United Arab Emirates that he said involved $35 million being stolen through voice-changing technology.

Al Shihabi urged members to let the idea move forward so the government could study it and, if it agrees, return with a government bill. ‘I ask the council to give this idea a chance to enter the legislative cycle,’ he said.

Dr Mohammed Ali Hassan, one of the proposers, said the aim was to punish people who use deepfake technology to produce audio or video and then publish or distribute it, with harmful results that can spread quickly. He argued that the current legal texts do not deal clearly with the technique named in the proposal and urged members to allow it to go to the government for review.

Dr Abdulaziz Abul also backed giving the draft more time, telling the chamber that new technology is spreading quickly and can be misused, and that a clear penalty could deter those tempted to falsify audio and video.

The Ministry of Justice, Islamic Affairs and Waqf backed the same line. It told the committee there was no gap in the law and pointed to Penal Code provisions on offences against people and their reputations, alongside Article 23 of the Information Technology Crimes Law, as covering the conduct described in the draft. It also said the Public Prosecution classifies facts during investigations based on the circumstances of each case.

In its submission to the committee, the Ministry of Interior argued against creating a separate deepfake article now. It said no reported standalone cases had been recorded and that current laws already give prosecutors enough tools to deal with such cases. The ministry also warned that deepfake technology is still developing and that current detection methods do not provide the certainty needed for criminal proof, especially in complex videos. It said a new offence would also require specialist technical capability, extra staff and additional forensic checks that it did not see as needed at this stage.