OpenAI Establishes Five-Level System To Gauge AI Progress
The ChatGPT creator revealed the new classification system to employees during a recent company-wide meeting.
OpenAI has introduced a five-tier framework to monitor its advancement toward developing artificial intelligence that can rival and even surpass human capabilities.
The initiative is the latest in the startup’s efforts to enhance public understanding of AI safety and was shared with staff during a company-wide meeting on Tuesday, July 9. OpenAI intends to present the levels, which span from conversational AI (Level 1) to AI that can independently operate an entire organization (Level 5), to investors and other stakeholders.
During the meeting, OpenAI executives informed employees that the company is currently at the first level but is nearing the second, known as Reasoners. This tier represents AI systems capable of basic problem-solving at a level comparable to a human with a doctorate-level education.
In the same session, OpenAI’s leadership demonstrated a research project featuring the GPT-4 AI model, showcasing new skills indicative of human-like reasoning. For years, the company has been working towards achieving what is often referred to as artificial general intelligence (AGI), which entails creating computers that can outperform humans in most tasks. Such systems do not yet exist, though OpenAI CEO Sam Altman has previously suggested that AGI might be achievable later this decade.
Determining the criteria for AGI has been a topic of ongoing debate among AI researchers. In a paper published in November 2023, researchers at Google DeepMind proposed a framework of five ascending AI levels, including “expert” and “superhuman”, which resembles the classification system used in the automotive industry for self-driving cars.
According to OpenAI’s proposed levels, the third tier on the road to AGI is called Agents, representing AI systems that can perform tasks autonomously over several days. Level 4, Innovators, describes AI that can generate new innovations, while the highest level, Organizations, refers to AI capable of managing entire enterprises.
The framework, developed by OpenAI executives and senior leaders, is considered a work in progress. The company plans to collect feedback from employees, investors, and its board, with the possibility of refining the levels over time.
Samsung Smart Glasses Teased For January, Software Reveal Imminent
According to Korean sources, the new wearable will launch alongside the Galaxy S25, with the accompanying software platform unveiled this December.
Samsung appears poised to introduce its highly anticipated smart glasses in January 2025, alongside the launch of the Galaxy S25. According to sources in Korea, the company will first reveal the accompanying software platform later this month.
According to a report from Yonhap News, Samsung’s unveiling strategy for the smart glasses echoes its approach with the Galaxy Ring earlier this year. The January showcase won’t constitute a full product launch but will likely feature teaser visuals at the Galaxy S25 event, with a more detailed rollout to follow in subsequent months.
Just in: Samsung is set to unveil a prototype of its augmented reality (AR) glasses, currently in development, during the Galaxy S25 Unpacked event early next year, likely in the form of videos or images.
Additionally, prior to revealing the prototype, Samsung plans to introduce…
— Jukanlosreve (@Jukanlosreve) December 3, 2024
The Galaxy Ring, for example, debuted in January via a short presentation during Samsung’s Unpacked event. The full product unveiling came later at MWC in February, and the final release followed in July. Samsung seems to be adopting a similar phased approach with its smart glasses, which are expected to hit the market in the third quarter of 2025.
A Collaborative Software Effort
Samsung’s partnership with Google has played a key role in developing the smart glasses’ software. This collaboration was first announced in February 2023, with the device set to run on an Android-based platform. In July, the companies reiterated their plans to deliver an extended reality (XR) platform by the end of the year. The software specifics for the XR device are expected to be unveiled before the end of December.
Reports suggest the smart glasses will resemble Ray-Ban Meta smart glasses in functionality. They won’t include a display but will weigh approximately 50 grams, emphasizing a lightweight, user-friendly design.
Feature Set And Compatibility
The glasses are rumored to integrate Google’s Gemini technology, alongside features like gesture recognition and potential payment capabilities. Samsung aims to create a seamless user experience by integrating the glasses with its broader Galaxy ecosystem, starting with the Galaxy S25, slated for release on January 22.