OpenAI CEO says AI to reshape society, acknowledges risks

    WORLD  19 March 2023 - 15:46

    The CEO of the company behind ChatGPT believes artificial intelligence technology will reshape society as we know it.

    He believes it comes with real dangers, but can also be "the greatest technology humanity has yet developed" to drastically improve our lives, ABC News reports.

    "We've got to be careful here," said Sam Altman, CEO of OpenAI. "I think people should be happy that we are a little bit scared of this."

    Altman sat down for an exclusive interview with ABC News' chief business, technology and economics correspondent Rebecca Jarvis to talk about the rollout of GPT-4 -- the latest iteration of the AI language model.

    In his interview, Altman was emphatic that OpenAI needs both regulators and society to be as involved as possible with the rollout of ChatGPT — insisting that feedback will help deter the potential negative consequences the technology could have on humanity. He added that he is in "regular contact" with government officials.

    ChatGPT is an AI language model; the "GPT" stands for Generative Pre-trained Transformer.

    Released only a few months ago, it is already considered the fastest-growing consumer application in history. The app hit 100 million monthly active users in just a few months. In comparison, TikTok took nine months to reach that many users and Instagram took nearly three years, according to a UBS study.

    Though "not perfect," per Altman, GPT-4 scored in the 90th percentile on the Uniform Bar Exam. It also scored a near-perfect score on the SAT Math test, and it can now proficiently write computer code in most programming languages.

    GPT-4 is just one step toward OpenAI's goal of eventually building Artificial General Intelligence: AI systems that are generally smarter than humans.

    Though he celebrates the success of his product, Altman acknowledged the potentially dangerous applications of AI that keep him up at night.

    "I'm particularly worried that these models could be used for large-scale disinformation," Altman said. "Now that they're getting better at writing computer code, [they] could be used for offensive cyberattacks."

    A common sci-fi fear that Altman doesn't share: AI models that don't need humans, that make their own decisions and plot world domination.

    "It waits for someone to give it an input," Altman said. "This is a tool that is very much in human control."

    However, he said he does fear which humans could be in control. "There will be other people who don't put some of the safety limits that we put on," he added. "Society, I think, has a limited amount of time to figure out how to react to that, how to regulate that, how to handle it."

    President Vladimir Putin is quoted telling Russian students on their first day of school in 2017 that whoever leads the AI race would likely "rule the world."

    "So that's a chilling statement for sure," Altman said. "What I hope, instead, is that we successively develop more and more powerful systems that we can all use in different ways that integrate it into our daily lives, into the economy, and become an amplifier of human will."

    Concerns about misinformation

    According to OpenAI, GPT-4 has massive improvements over the previous iteration, including the ability to understand images as input. Demos show GPT-4 describing what's in someone's fridge, solving puzzles, and even articulating the meaning behind an internet meme.

    This feature is currently only accessible to a small set of users, including a group of visually impaired users who are part of its beta testing.

    But a consistent issue with AI language models like ChatGPT, according to Altman, is misinformation: The program can give users factually inaccurate information.

    "The thing that I try to caution people the most is what we call the 'hallucinations problem,'" Altman said. "The model will confidently state things as if they were facts that are entirely made up."

    The model has this issue, in part, because it uses deductive reasoning rather than memorization, according to OpenAI.

    "One of the biggest differences that we saw from GPT-3.5 to GPT-4 was this emergent ability to reason better," Mira Murati, OpenAI's Chief Technology Officer, told ABC News.

    "The goal is to predict the next word – and with that, we're seeing that there is this understanding of language," Murati said. "We want these models to see and understand the world more like we do."

    "The right way to think of the models that we create is a reasoning engine, not a fact database," Altman said. "They can also act as a fact database, but that's not really what's special about them – what we want them to do is something closer to the ability to reason, not to memorize."

    Altman and his team hope "the model will become this reasoning engine over time," he said, eventually able to use the internet and its own deductive reasoning to separate fact from fiction. GPT-4 is 40% more likely to produce accurate information than its previous version, according to OpenAI. Still, Altman said relying on the system as a primary source of accurate information "is something you should not use it for," and encouraged users to double-check the program's results.

    Precautions against bad actors

    The type of information ChatGPT and other AI language models contain has also been a point of concern: for instance, whether ChatGPT could tell a user how to make a bomb. The answer is no, per Altman, because of the safety measures coded into ChatGPT.

    "A thing that I do worry about is ... we're not going to be the only creator of this technology," Altman said. "There will be other people who don't put some of the safety limits that we put on it."

    There are a few solutions and safeguards to all of these potential hazards with AI, per Altman. One of them: Let society toy with ChatGPT while the stakes are low, and learn from how people use it.

    Right now, ChatGPT is available to the public primarily because "we're gathering a lot of feedback," according to Murati.

    As the public continues to test OpenAI's applications, Murati says it becomes easier to identify where safeguards are needed.

    "What are people using them for, but also what are the issues with it, what are the downfalls, and being able to step in [and] make improvements to the technology," says Murati. Altman says it's important that the public gets to interact with each version of ChatGPT.

    "If we just developed this in secret -- in our little lab here -- and made GPT-7 and then dropped it on the world all at once ... That, I think, is a situation with a lot more downside," Altman said. "People need time to update, to react, to get used to this technology [and] to understand where the downsides are and what the mitigations can be."

    Regarding illegal or morally objectionable content, Altman said they have a team of policymakers at OpenAI who decide what information goes into ChatGPT, and what ChatGPT is allowed to share with users.

    "[We're] talking to various policy and safety experts, getting audits of the system to try to address these issues and put something out that we think is safe and good," Altman added. "And again, we won't get it perfect the first time, but it's so important to learn the lessons and find the edges while the stakes are relatively low."

    Will AI replace jobs?

    Among the concerns about the technology's destructive potential is the replacement of jobs. Altman says AI will likely replace some jobs in the near future, and he worries about how quickly that could happen.

    "I think over a couple of generations, humanity has proven that it can adapt wonderfully to major technological shifts," Altman said. "But if this happens in a single-digit number of years, some of these shifts ... That is the part I worry about the most."

    But he encourages people to look at ChatGPT as more of a tool, not as a replacement. He added that "human creativity is limitless, and we find new jobs. We find new things to do."

    The ways ChatGPT can be used as a tool for humanity outweigh the risks, according to Altman.

    "We can all have an incredible educator in our pocket that's customized for us, that helps us learn," Altman said. "We can have medical advice for everybody that is beyond what we can get today."

    ChatGPT as "co-pilot"

    In education, ChatGPT has become controversial, as some students have used it to cheat on assignments. Educators are torn on whether this could be used as an extension of themselves, or if it deters students' motivation to learn for themselves.

    "Education is going to have to change, but it's happened many other times with technology," said Altman, adding that students will be able to have a sort of teacher that goes beyond the classroom. "One of the ones that I'm most excited about is the ability to provide individual learning -- great individual learning for each student."

    In any field, Altman and his team want users to think of ChatGPT as a "co-pilot": someone who could help you write extensive computer code or solve problems.

    "We can have that for every profession, and we can have a much higher quality of life, like standard of living," Altman said. "But we can also have new things we can't even imagine today -- so that's the promise."

