PC in the Cloud? What challenges do streaming services face?

13-05-2022 | By Robin Mitchell

Demand for cloud streaming of games and access to powerful remote processors is growing, but many technical challenges persist. Why is the concept of cloud computing not new, what challenges will new streaming services face, and will remote computing ever take over?


Why is the concept of Cloud Computing not new?


While the concept of cloud computing may seem new, its origins actually trace back to some of the first mainstream computers of the 1960s and 1970s. Before we can understand how remote computing became commonplace during this period, we first need to understand why computers were large and expensive.

For a computer to be practical, it needs to be able to perform calculations on real-world data that can be used in real-world applications. For example, the first military computers were used to compute artillery tables that allowed ships to hit their targets accurately, while some of the first IBM machines were used to tabulate data for the US national census.

Regardless of the technology used, computers typically only become practical for serious work once they have data busses of at least 16 bits, memories in the hundreds of megabytes, and instruction rates in the millions per second. That is not to say that 8-bit machines cannot be used practically; they dominated the home computer market for a time. But while they can be useful for home applications, they most certainly cannot handle large-scale computing applications such as scientific research and transaction processing.

As such, even the first computers had to be scaled up quickly in order to be practical. For example, the IBM System/360, which launched in 1964, had 32-bit words and a 24-bit address space, and its fastest models could execute up to 16.6 MIPS. Such computers were constructed from discrete components, which saw their physical size grow rapidly. Thus, it made economic sense to build one large computer with decent capabilities that could serve many purposes instead of a smaller system with reduced capabilities (as those reduced capabilities would be insufficient for demanding tasks).

The development of very large mainframe machines made even more sense once engineers developed the concepts of remote computing and time-sharing. It is significantly cheaper to have one large mainframe serve 100 users on terminal machines (a screen, keyboard, and basic interface) than to give all 100 users their own dedicated machines.

This use of remote computing continued for several decades, whereby companies would purchase large mainframes that employees would log into using small desktop terminals. As computing power improved, these terminals were gradually replaced with desktop computers that could provide some degree of local processing while the mainframe provided users with additional processing power if needed. Some mainframes even allowed for remote access over telephone networks with the use of a modem, and this led to some companies renting out their spare processing power to others in time-sharing schemes.


What challenges will new streaming services face?


Many companies have made recent attempts to bring computing back into mainframes and datacentres. For example, Amazon Web Services provides cloud computing services that scale with the applications running on them, so an application requiring more resources can be assigned them dynamically.
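To make the idea of dynamic resource assignment concrete, the sketch below shows a toy target-tracking scaling rule in Python. It is purely illustrative: the thresholds, instance counts, and the scale() function are invented for the example and do not correspond to any provider's actual API.

```python
# Toy illustration of dynamic resource scaling, similar in spirit to the
# autoscaling that cloud providers automate. All thresholds and capacities
# here are invented; real services expose this through their own APIs.

def scale(current_instances: int, cpu_utilisation: float,
          target: float = 0.60, min_instances: int = 1,
          max_instances: int = 20) -> int:
    """Return a new instance count for a simple target-tracking policy."""
    if cpu_utilisation <= 0:
        return min_instances
    # Resize the fleet so that projected utilisation lands near the target.
    desired = round(current_instances * (cpu_utilisation / target))
    return max(min_instances, min(max_instances, desired))

# A fleet of 4 instances running at 90% CPU is scaled out to 6,
# while the same fleet idling at 15% is scaled back in to 1.
print(scale(4, 0.90))  # 6
print(scale(4, 0.15))  # 1
```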

Another example is Microsoft Azure, which provides broadly the same services as Amazon Web Services. Google also provides cloud-based software such as Google Docs and Google Sheets, which are free browser-based alternatives to Word and Excel.

The advantages of cloud computing are numerous: accessibility regardless of location, the removal of the need for powerful local computers, the ability to work across multiple platforms, and the security offered by datacentres (an unexpected loss of power at the user's end won't cause data loss). Furthermore, cloud computing puts the responsibility for system upkeep on the datacentre, and the large number of users pooling their resources allows access to hardware that would otherwise be too expensive to own (server-grade CPUs, high-end GPUs, etc.).

In fact, both Nvidia and Microsoft are exploring the idea of streaming computing resources. For example, Nvidia has a game streaming service called GeForce Now that allows subscribers to play games remotely on Nvidia-hosted servers. Microsoft is also said to be developing its own Xbox anywhere service that removes the need for customers to purchase consoles, instead letting them access Xbox games remotely on Microsoft servers.

However, remote computing services such as those being developed by Microsoft and offered by Nvidia face many challenges. The biggest challenge by far is the quality of the connection between the user and the server.

For non-intensive applications such as word processing, the quality of an internet connection is rarely an issue as the amount of data being exchanged between the client and server is minimal. An intensive application such as a game, however, not only requires high-quality video streaming but also low-latency input response.
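A rough, back-of-the-envelope comparison in Python makes the gap clear; the bitrates used below are illustrative assumptions rather than measurements of any particular service.

```python
# Back-of-the-envelope comparison of the data a connection must carry for a
# cloud document editor versus a game stream. Both bitrates are assumptions
# chosen only to show the difference in scale.

doc_sync_kbps = 10         # occasional keystroke/state sync for a document
game_stream_mbps = 15      # a plausible bitrate for 1080p video at 60 fps

doc_mb_per_hour = doc_sync_kbps * 3600 / 8 / 1000          # kilobits -> megabytes
stream_gb_per_hour = game_stream_mbps * 3600 / 8 / 1000    # megabits -> gigabytes

print(f"Document editing: ~{doc_mb_per_hour:.1f} MB per hour")        # ~4.5 MB
print(f"1080p60 game stream: ~{stream_gb_per_hour:.1f} GB per hour")  # ~6.8 GB
```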

Considering that a typical player running a game locally can expect pings of between 20 ms and 40 ms to a game server on a good day, relying on a remote server to stream video and relay keystrokes could see this worsen significantly. The greater the distance between the player and the streaming server, the worse the effect becomes, so either latency or video quality has to suffer.
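The distance effect alone can be estimated from the speed of light in fibre; the short Python sketch below assumes signals travel at roughly two-thirds of c and ignores routing, queuing, and encoding delays, all of which only add to the total.

```python
# Round-trip propagation delay as a function of distance to the streaming
# server, assuming signals travel through fibre at roughly two-thirds of the
# speed of light and ignoring routing, queuing, and encode/decode time.

SPEED_OF_LIGHT_KM_S = 299_792
FIBRE_FACTOR = 2 / 3

def round_trip_ms(distance_km: float) -> float:
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR)
    return 2 * one_way_s * 1000

for km in (100, 500, 2000):
    print(f"{km:>5} km to the server adds ~{round_trip_ms(km):.1f} ms "
          "before any encoding, decoding, or display latency")
```

Even under these optimistic assumptions, a server a few hundred kilometres away consumes a meaningful share of the 20 ms to 40 ms budget mentioned above.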

Thus, only those with an excellent connection to the server will be able to take full advantage of such streaming services. Even then, a service can only handle so many users at any one time, and users in the same geographical region are likely to be gaming at the same times (after work, in the evening, and so on). This could cause performance degradation during peak hours, further reducing video quality or increasing input latency.
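A simple capacity sketch illustrates the peak-time problem; every figure in it (server count, streams per server, subscriber numbers, concurrency rates) is invented purely to show how evening demand in a single region becomes the binding constraint.

```python
# Capacity sketch for peak-time contention in one region. Every figure below
# is invented for illustration; real subscriber counts, server fleets, and
# concurrency patterns are not public.

servers_in_region = 500
streams_per_server = 8              # assumed GPU slots per streaming host
subscribers_in_region = 100_000

capacity = servers_in_region * streams_per_server   # 4,000 concurrent streams

for label, concurrency in (("off-peak", 0.02), ("evening peak", 0.08)):
    demand = int(subscribers_in_region * concurrency)
    status = "fine" if demand <= capacity else "oversubscribed"
    print(f"{label}: {demand} players competing for {capacity} slots ({status})")
```

Once demand exceeds capacity, the service must either queue players or lower per-stream quality, which is exactly the degradation described above.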


Will remote computing ever take over?


Even though great strides have been made in remote computing, it is doubtful that it will become the norm for intensive applications. Everyday applications such as word processing benefit greatly from cloud computing, as documents can be accessed anywhere at any time, but those who like to game will often invest time and energy into their own gaming hardware. Furthermore, supporting every game in existence would be difficult for servers to handle, and this could see remote gaming limited to a few popular titles while smaller games continue to be played on mainstream PCs.

Subscription-based services, however, could make remote computing more popular as companies look for alternative ways of generating revenue. By keeping all hardware and software on remote platforms, those looking to gain access must pay monthly fees, which can provide companies with a healthy revenue stream. Of course, subscription models are widely resented, as they are seen as a way to prevent outright ownership of hardware and software, and abusing such a revenue model could see customers actively refuse to support remote computing solutions.


By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation, developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.