Centralized cloud services were originally designed around storage and general-purpose processing. They excelled at data processing, data storage, and hosting enterprise software applications, a design shaped by the needs of businesses and the traditional software stack during the early expansion of cloud computing.
However, as technology has evolved, especially in the artificial intelligence and gaming industries, demand for graphics processing units (GPUs) has grown significantly. GPUs are crucial for tasks such as rendering complex graphics, machine learning, and other AI workloads. Centralized cloud infrastructure was not originally designed to provision GPU resources efficiently, and this limitation has become increasingly apparent.
The inadequacies of the current cloud infrastructure can be summarized as follows:
- Cost: Centralized cloud infrastructure is costly to procure, scale, and maintain, requiring substantial upfront investment. Utilizing GPU resources from public cloud providers such as Amazon Web Services or Microsoft Azure can also be prohibitively expensive, particularly when considering the long-term requirements of applications.
- GPU scarcity: There is a well-documented shortage of GPU resources in the public cloud, making it challenging to meet the demands of various industries.
- Latency: To reduce costs, service providers typically concentrate their equipment in centralized data centers rather than dispersing it across multiple locations. As a result, performance varies with a user’s proximity to the data center serving their application: users closer to it experience lower latencies, while those farther away experience higher ones.
Decentralized cloud infrastructure as a potential solution
Decentralized cloud infrastructure offers a more cost-effective alternative, distributing ownership of the underlying hardware across independent service providers (nodes) rather than concentrating it in a single entity.
In this operating model, the service providers deploy resources based on their individual capabilities, keeping overhead expenses manageable while retaining scalability. Moreover, this decentralized model can respond more adeptly to dynamic market trends.
According to Mark Rydon, co-founder and CEO of Aethir, a decentralized cloud infrastructure provider, the key to lowering latency is that a decentralized network has no incentive to cluster hardware in a handful of locations. With a distributed network of smaller, independent nodes, activated as needed across various regions, users are far more likely to be served by a node close to their location, which results in lower latencies.
“Local service delivery is more cost-effective than regional or international delivery. As the network grows, Aethir gains speed, prioritizing local content delivery while also reducing costs. This makes Aethir’s approach highly effective in addressing latency,” Rydon said.
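The latency argument can be made concrete with a toy routing example. The sketch below is purely illustrative and is not Aethir’s actual scheduling logic: the node names, coordinates, and the use of geographic distance as a stand-in for measured network latency are all assumptions made for the example.

```python
# Illustrative only: a naive nearest-node selector. Node names, coordinates,
# and the distance heuristic are hypothetical; a real scheduler would use
# measured round-trip times, load, and availability rather than geography alone.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt


@dataclass
class Node:
    name: str
    lat: float
    lon: float
    available: bool = True


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres, used here as a crude latency proxy."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))


def pick_node(user_lat, user_lon, nodes):
    """Route a session to the closest available node: fewer kilometres generally
    means fewer network hops and a lower round-trip time."""
    candidates = [n for n in nodes if n.available]
    return min(candidates, key=lambda n: haversine_km(user_lat, user_lon, n.lat, n.lon))


nodes = [
    Node("singapore-gpu-01", 1.35, 103.82),
    Node("frankfurt-gpu-01", 50.11, 8.68),
    Node("sao-paulo-gpu-01", -23.55, -46.63),
]

# A player in Jakarta is routed to the nearby Singapore node rather than a
# distant hub, which is the effect a geographically distributed network enables.
print(pick_node(-6.2, 106.8, nodes).name)  # -> singapore-gpu-01
```

The point of the sketch is simply that a network of many small, regionally dispersed nodes gives the selector more nearby options, whereas a handful of centralized data centers forces most users onto distant hardware.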
Aethir as a marketplace
Aethir operates as a GPU compute marketplace, connecting GPU resource providers with companies that need access to their compute capacity.
However, the initial stage of scaling such infrastructure resembles the classic chicken-and-egg problem: infrastructure providers prefer to work with marketplaces that already have a substantial user base, but marketplaces cannot attract enough users until sufficient infrastructure is in place for them to use.
To address this challenge, Aethir raised USD 10 million to fund its GPU deployment and is currently in the final stages of securing a much larger USD 200 million GPU deployment fund.
“These funds have been instrumental in establishing the initial infrastructure stack, enabling users to join and jumpstart the GPU marketplace within our network,” Rydon said.
Simultaneously, Aethir is actively pursuing strategic partnerships with GPU resource providers in both the Web2 and Web3 sectors. To encourage adoption, integration with Aethir’s infrastructure is kept straightforward, especially for cloud gaming services, a key vertical for the company.
“Think of cloud gaming as having a gaming PC in the cloud where the game is installed. When someone plays the game, a video of the gameplay is captured, and this video feed is streamed to the user. The user experiences an interactive YouTube-like video right in front of them,” Rydon said.
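The architecture Rydon describes can be summarized as a simple server-side loop: receive inputs, render, encode, stream. The sketch below is a conceptual outline only; the 60 fps target and the game, video_encoder, and network interfaces are hypothetical placeholders, not a real streaming API or Aethir’s implementation.

```python
# Conceptual sketch of a cloud-gaming session loop. The game runs and renders
# on a remote GPU node; each frame is encoded and streamed to the player, and
# the player's inputs are sent back. All objects here are hypothetical stubs.
import time

FRAME_INTERVAL = 1 / 60  # assumed 60 frames-per-second target


def run_cloud_gaming_session(game, video_encoder, network):
    while game.is_running():
        start = time.monotonic()

        # 1. Apply the latest player inputs received over the network.
        for event in network.receive_input_events():
            game.apply_input(event)

        # 2. Advance the simulation and render the next frame on the server-side GPU.
        frame = game.render_next_frame()

        # 3. Compress the frame and stream it to the player's thin client, which
        #    only needs to decode video, much like watching an interactive stream.
        network.send_video_packet(video_encoder.encode(frame))

        # 4. Pace the loop so frames go out at a steady rate.
        elapsed = time.monotonic() - start
        if elapsed < FRAME_INTERVAL:
            time.sleep(FRAME_INTERVAL - elapsed)
```

Because the heavy rendering happens inside this server-side loop, the player’s device only has to decode a video stream and send inputs, which is why a thin client can deliver a high-end gaming experience.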
While Aethir’s infrastructure is primarily geared toward PC and mobile gaming, it is actively working to establish compatibility with console-based titles as well. According to Rydon, Aethir’s goal is to integrate console-based titles by the second quarter of 2024.
“Developers view our infrastructure as a solution to expand their total addressable market and enter markets they previously didn’t have access to. … By reducing overall costs through our decentralized cloud infrastructure network, we’ve made it economically viable for game developers to host users in regions where they traditionally couldn’t while keeping it a profitable experience for them,” Rydon said.
With Aethir, users can access compatible cloud gaming services by downloading a “mini client,” typically around 10–15 megabytes, onto their mobile devices. This lightweight client is enough to plug the user into a cloud-hosted gaming experience, a fraction of the size of traditional game clients, which run to several gigabytes and rely heavily on the device’s native processing power.
Moreover, Aethir offers multiple ways to integrate its infrastructure into games. Users can access games instantly via clickable links or play them in the cloud through platforms such as Steam, eliminating the need to download any software.
Future outlook
While decentralized cloud infrastructure can provide notable benefits, it is technically complex and resource-intensive to manage, and to date only major tech companies such as Google, Amazon, and Tencent have possessed the resources needed to adopt such a model. The nascency of the industry also makes it difficult to assemble teams with the required expertise, according to Rydon.
Furthermore, network stability remains essential for cloud gaming to feel seamless, which makes decentralized cloud infrastructure difficult to deploy in regions where internet connectivity is intermittent or frequently disrupted.
Cloud gaming is poised to transform the gaming landscape by removing the need for users to invest heavily in hardware, a shift that will be pivotal in bringing on board gamers who were previously excluded by hardware limitations. However, realizing this vision hinges on effectively addressing the scalability challenges that lie ahead.