The biggest danger is voice imitation

Asst. Prof. Nuri Bingöl, Deputy Director of the Üsküdar University Institute of Natural Sciences and Director of its Distance Education Application and Research Center (ÜSUZEM), shared his assessment of social media applications and the privacy of personal data, topics that have been widely debated in recent days.

Social media applications and personal data security are among the most discussed topics of recent days. Experts note that WhatsApp has in fact been using user data all along, and that the agreement recently presented to users aims to put this practice on a legal footing. They also point out precautions to take when using such applications, stressing above all that personal information should not be shared. The main danger, they emphasize, is not the collection of location information or images but the harvesting of personal biometric data such as the voice.

Asst. Prof. Nuri Bingöl: "WhatsApp was already using the data"

Noting that WhatsApp has used personal data before, and that this time it is presenting an agreement, Asst. Prof. Nuri Bingöl said, "Say you use the location feature: you turn on location sharing in WhatsApp and send your location to the other party. Although the data you send is encrypted end to end, WhatsApp can still make use of it, including for its own benefit. This is where the problem arises. WhatsApp was already using our data. Now it has sent an agreement that in effect says, 'Give me permission; I am already using this data, and now I will use it more openly.'"

Asst. Prof. Nuri Bingöl: "WhatsApp wants to put its practice on a legal basis"

Stating that WhatsApp wants to put this practice on a legal basis with the agreement, Asst. Prof. Nuri Bingöl said, "It wants to establish a legal framework on its own terms and sell this data to third-party companies. Its own group was already using the data for the Facebook and Instagram applications, for commercial purposes such as marketing and advertising. Now it will be able to sell it to third parties and to governments. Artificial intelligence applications are already used by many states, especially the US, in connection with terrorist incidents, so this personal information will be handed over to them. It was to be expected that a company with a track record like Facebook's would do this. By putting it on a legal basis, however, in a possible future lawsuit it will be able to tell the court, 'I used your data, but you accepted my agreement; here is the evidence.'"
Stating that the agreement has left users feeling unsafe, Bingöl said that many users have already switched to alternative applications and that a migration has begun.

Asst. Prof. Nuri Bingöl: "WhatsApp's agreement is a double standard"

Stating that WhatsApp's applying this agreement to Turkey is a double standard, Asst. Prof. Nuri Bingöl said, "There is one practice for European countries and another for us. Many experts attribute this to the weakness of our legal framework, but that is not the reason. The reason is that the European Union has high purchasing power and is a better market, and WhatsApp cannot afford to give it up."

Asst. Prof. Nuri Bingöl: "They should have offices in Turkey"

Asst. Prof. Nuri Bingöl stated that social media applications and platforms such as Facebook, Twitter and YouTube need to have offices in Turkey, and said, "The Digital Transformation Office of the Presidency of the Republic of Turkey asked these companies to open offices in Turkey, and consumer and e-commerce sites have started to open offices here. Our biggest problem is that we cannot hold accountable an organization that has no legal representation here. If they want to do business in our country, they must have offices here and be subject to our laws."

Asst. Prof. Nuri Bingöl: "New opportunities may arise for local applications"

Asst. Prof. Nuri Bingöl underlined that the difficulties in this field can also create opportunities for new domestic applications to emerge. Bingöl said, "We have now developed our own applications; we have domestic applications and domestic companies, and users can switch to them very easily. They have been available for a long time: tried, tested, secure platforms that do not retain and process certain data."

Assoc. Prof. Dr. Türker Tekin Ergüzel: "The biggest danger is that our voice can be imitated"

Assoc. Prof. Dr. Türker Tekin Ergüzel, Head of the Software Department at Üsküdar University, stressed the importance of how personal data is shared and pointed out that the most serious danger here is the cloning of the voice. Ergüzel said the following:

"When it comes to WhatsApp, the first thing that comes to mind is text: communication over written messages. But do you know what is really valuable? There are applications based on machine-learning algorithms. The voice you send over WhatsApp is singular; it is unique to you, and voice-processing algorithms can capture that uniqueness. They can now build voice-imitation algorithms. I am talking about a mobile application, or an algorithm derived from one, that speaks in your tone of voice. That is the real danger. More than our texts and the pictures we share, I am talking about a piece of data that is singular to each of us: voice data. On one side they collect your location information and your images, but on the other they are also collecting biometric data that is specific to you, such as your voice, and that is what is truly valuable. By processing that data, these algorithms will learn your voice and try to produce speech that feels like you speaking, expresses itself like you and matches your tone of voice. That is where the real danger lies: data analysis through big data. WhatsApp says it is providing this data to Facebook. You may say, 'I don't have a Facebook account,' but the data will be fed into the big data pool regardless, not returned to you. That will remain a subject with big question marks for the future."

Stating that WhatsApp does not make this imposition on European Union countries, Assoc. Prof. Dr. Türker Tekin Ergüzel said, "European Union countries are not being asked to sign this text. It is an imposition on developing countries."

Explaining that in an end-to-end encryption system a message can only be opened by the party it is sent to, Assoc. Prof. Dr. Türker Tekin Ergüzel said, "While the data is in transit, no third party can access the message you send, whether it is audio data or image data. No one can reach it. There is encryption between the sender and the receiver; it works with a decryption algorithm at the other end and effectively guarantees the security of the data."
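The end-to-end principle Ergüzel describes can be sketched in a few lines. The sketch below is a toy illustration, not a real messenger protocol: it uses a one-time pad (XOR with a random key the same length as the message) so that a relaying "server" only ever sees ciphertext. Production systems such as WhatsApp use the Signal protocol, with key exchange and authenticated encryption; the `make_shared_key` helper here simply stands in for that key-agreement step.

```python
import secrets

# Toy illustration of end-to-end encryption: only the two endpoints
# hold the key, and whatever relays the message sees only ciphertext.
# A one-time pad is used purely for simplicity.

def make_shared_key(length: int) -> bytes:
    # In a real system the endpoints would derive this via a key
    # exchange (e.g. Diffie-Hellman); here we generate it directly.
    return secrets.token_bytes(length)

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each plaintext byte with the corresponding key byte.
    assert len(key) == len(plaintext)
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
key = make_shared_key(len(message))  # known only to the two endpoints
ciphertext = encrypt(key, message)   # this is all the relay ever sees

# Only the holder of the key can recover the original message.
assert decrypt(key, ciphertext) == message
```

Because the key never leaves the two endpoints, the party in the middle can store and forward the ciphertext but cannot read it, which is exactly the guarantee described in the quote above.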

Assoc. Prof. Dr. Türker Tekin Ergüzel: "Attention should be paid to individual security"

Referring to the points to watch when using such applications, Assoc. Prof. Dr. Türker Tekin Ergüzel said, "Whichever application is used, what must always be considered is the protection of your private information. In line with the protection of personal data, this information should not be shared; we must protect ourselves. Sensitive personal data should not be kept on computers and phones. It is important to ensure individual security and to be careful about what is stored on our devices."

Creator: NP Istanbul Hospital Editorial Board
Created: 14 January 2021
Updated: 05 March 2024