The Evolution of Computer Memory – From Semiconductors to Proteins

Semiconductor Memory

Conventional computer memory is known as “semiconductor memory” and was invented in 1968. It is built on semiconductor devices such as the transistor, which was invented in 1947. Many such devices grouped together form an “integrated circuit”, more commonly known as a “computer chip”. Examples of semiconductor memory include ROM, RAM and flash memory. A big advantage of computer RAM (main memory) is price; RAM is inexpensive. Its main disadvantage is volatility; when you turn your computer off, the contents of RAM are lost.

Molecular Memory

Molecular memory is the name of a technology that uses organic molecules to store binary data. The Holy Grail of this technology would be to use a single molecule to store a single bit; for the near future, it is more realistic to expect systems that use large groups of molecules to represent one bit. Different types of molecules have been researched, including proteins. A molecular memory system that uses protein molecules is known more precisely as protein memory; other types of molecular memory take their more precise names from the molecules on which they are based.

Protein Memory

In the mid-1990s, the development of a protein-based memory system was the project of Robert Birge – chemistry professor and director of the W.M. Keck Center for Molecular Electronics. He was assisted by Jeff Stuart, a biochemist and one of Birge’s graduate students. The protein molecule in question is called bacteriorhodopsin. Purple in color, it exists in the microorganism Halobacterium halobium, which thrives in salt marshes where temperatures can reach 140 °F.

The protein undergoes a molecular change when subjected to light, making it ideal for representing data. Each molecular change is part of a series of many different states known as the photocycle. There are three main states: the bR state, the O state and the Q state. The O state represents binary 0 and the Q state represents binary 1, while the bR or rest state is neutral. To survive the harsh conditions of a salt marsh, the protein must be incredibly stable, a critical factor if it is to be used for representing data.

While in the bR state, the protein is placed in a transparent vessel called a cuvette, measuring 1 x 1 x 2 inches. The cuvette is then filled with a gel, which solidifies and fixes the protein in place. Two arrays of lasers – one red and one green – are used to read and write data, while a blue laser is used for erasing.

Reading, Writing and Storage Capacity

We will start in the bR state of the photocycle. A group of molecules is targeted and hit by the green laser array, also known as the paging lasers. These molecules are now in the O state, which represents binary 0. The O state allows for two possible actions (the full read/write/erase cycle is sketched in the code after this list):

• Reading – done with the red laser array set at low intensity

• Writing a binary 1 – done with the red laser array set at high intensity which moves the molecules to the Q state

The Q state allows for two possible actions:

• Reading – done with the red laser array set at low intensity

• Erasing – done with the blue laser which moves the molecules back to the bR state
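The photocycle described above behaves like a small state machine. The Python sketch below is purely illustrative – the class, the method names and the single-cell granularity are assumptions made for clarity, not part of Birge’s actual system – but the transitions mirror the laser operations listed above.

```python
# Illustrative sketch of the bR/O/Q photocycle as a read/write/erase state machine.
BR, O, Q = "bR", "O", "Q"          # rest state, binary 0, binary 1

class ProteinCell:
    def __init__(self):
        self.state = BR            # molecules start in the bR rest state

    def page(self):
        """Green 'paging' laser: move a rest-state molecule to the O state."""
        if self.state == BR:
            self.state = O

    def write_one(self):
        """Red laser at high intensity: O -> Q, storing a binary 1."""
        if self.state == O:
            self.state = Q

    def read(self):
        """Red laser at low intensity: report 0 (O state) or 1 (Q state)."""
        if self.state == O:
            return 0
        if self.state == Q:
            return 1
        raise ValueError("cell has not been paged yet")

    def erase(self):
        """Blue laser: return the molecule to the bR rest state."""
        self.state = BR

cell = ProteinCell()
cell.page()             # green laser selects the page -> O state (binary 0)
print(cell.read())      # 0
cell.write_one()        # high-intensity red laser -> Q state (binary 1)
print(cell.read())      # 1
cell.erase()            # blue laser -> back to the bR rest state
```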

A bacteriorhodopsin storage system is slow. Although the molecules change states in microseconds (millionths of a second), that is slow compared to semiconductor memory, which has access times measured in nanoseconds. Unfortunately, the time required to actually perform a read or write is even greater, on the order of ten milliseconds (thousandths of a second). The data transfer rate of this type of storage device is also very slow – 10 MBps (MB per second). In theory, the 1 x 1 x 2 inch cuvette could hold 1 TB of data, or roughly one trillion bytes. In reality, Birge managed to store 800 MB and was hoping to achieve a capacity of 1.3 GB (billion bytes). The technology proved itself to the point that NASA explored ways of improving it during space shuttle missions, work that did result in higher storage densities.
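To put those figures in perspective, a quick back-of-the-envelope calculation (taking 1 TB as 10^12 bytes and 10 MBps as 10^7 bytes per second, both values quoted above) shows how long writing a full cuvette would take:

```python
# Rough arithmetic using the figures quoted above; purely illustrative.
capacity_bytes = 1e12                  # theoretical capacity of the cuvette (1 TB)
transfer_rate = 1e7                    # 10 MB per second, in bytes per second
seconds = capacity_bytes / transfer_rate
print(f"{seconds / 3600:.1f} hours")   # ~27.8 hours to write one full terabyte
```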

Conclusion

Birge’s quest to build a protein-based memory system for a desktop computer was unsuccessful. Although Birge’s vision failed, the development of some form of molecular memory (possibly protein memory) for desktop computers seems possible. Scientists have also continued to work on other ideas involving protein memory. One idea from 2006 was to apply a layer of bR proteins to the surface of DVDs to increase storage capacity, theoretically up to 50 TB (over 50 trillion bytes). By comparison, a dual-layer Blu-ray disc has a capacity of 50 GB (over 50 billion bytes).

Smart Cities: Harnessing the Power of Technology in Local Government

The global economy is in constant turmoil, governments are challenged to provide levels of service that typically only private businesses can deliver, and disruptive technologies are transforming industry at a rapid pace. In this bold new era where change is unavoidable, it is the innovative who are surely positioned to thrive.

In 2008, the global economy fell into recession; it was the most significant downturn since the Great Depression. Recovery has been a long and exasperating struggle, at times feeling like we are clinging to the edge of a cliff, desperately trying to hold on and weather the storm.

Some economists even predict we are on the verge of another global recession. Foreign and domestic factors both contribute significantly to the swelling pessimism: the unpredictability of Trump’s tweets, the looming collapse of the Euro, the astonishingly low cost of crude oil, student loan debt suffocating young adults who are unable to contribute to economic growth and, let’s not forget, the dreaded silver tsunami.

It’s fair to say the economic outlook for the United States and Canada is unsettling. As a result of financial uncertainty, many government agencies across North America are seeing their budgets tightened while expectations from constituents continue to rise. Public opinion toward government processes, civic services and budgetary spending has become progressively more volatile in recent years.

So, how does one do more with less? Well, when you consider that we live in a world where virtual reality, 3D printing, quadcopters, pocket-sized spectrometers, and self-driving cars are no longer simply things of science fiction, the answer may be staring us directly in the face.

There is a plethora of software solutions that assist with a wide range of government functions. The technologies available today disrupt the old ways of doing things: solutions that automate and streamline processes, compile and share information across departments, increase citizen engagement and enable open government.

Numerous communities across North America seem to be transforming into smart cities overnight; governments are adopting innovative technologies as a means to become more efficient and generate new revenue. Even more uplifting is that these advancements no longer cater only to federal and state agencies: local governments are now adopting technology solutions that have become repeatable and affordable.

Optimism should not be confused with naivety. The challenges confronting governments are certainly daunting, and implementing change successfully warrants its own discussion, but regardless of these hurdles the solutions being developed out of necessity are truly inventive and inspiring. More and more government agencies seem eager to evolve, refusing to fear disruptive technology and instead using it as an opportunity to prosper. Navigating the complexities of this digital age is no easy task, but in a world ripe for change, to the innovative go the spoils.

Derek, an employee of the e-government consulting firm zedIT Solutions, works in collaboration with colleagues and partners to weave through the complexities of technology and help transform public sector clients into digitally enabled government agencies.

Role of Big Data and the Cloud in the Gaming Industry

For every gaming company, greater player engagement is the key to increasing revenue and staying ahead of competitors. Every click and player interaction with the game creates valuable data, which gaming companies analyze thoroughly to ensure that players stay engaged and keep coming back for more.

As the gaming industry continues to grow and expand, the role of big data becomes more critical due to the accumulation of a large volume of data. Big Data takes into account every single interaction players make with the game, storing a large volume of raw data ready to be analyzed. But the real challenge lies in making the best use of the collected data.

The global gaming industry is growing at a rapid pace each year and generating massive revenue. For this reason, the top gaming companies keep searching for new and unique ways of harnessing best-in-class technologies to capture larger portions of the market. Around 50 TB of data per day is generated by the more than 2 billion gamers in the world, while social games generate around 150 GB of data per day. In such a scenario, the use of Big Data technology in the gaming industry doesn’t come as a surprise at all.

Gaming has become a key contributor to big data, and an effective BI system in the gaming industry allows companies to draw conclusions about gamers’ tastes, levels of satisfaction, and spending patterns. This is achieved when data collected from several external sources is analyzed against stored historical data to provide a better gaming experience to players, with uninterrupted play sessions.

Further, strategically implemented cloud-based services can address many of the technological challenges faced by the gaming industry. Opting for cloud services from the leading cloud service providers in India is a practical solution for companies that need terabyte-scale storage and access to large volumes of records for instant analysis, with minimal long-term investment.

Understanding what drives each of the gamer segments to play for longer durations and keep coming back for more requires analyzing logs in combination with player data to identify common characteristics. This helps gaming companies improve their game and player experience on the basis of real player data feedback.
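As an illustration only, the sketch below shows one way such an analysis might look: joining raw event logs with player profiles and comparing average session length across segments. The column names, the segments and the use of pandas are assumptions made for the example, not a description of any particular company’s pipeline.

```python
# Hypothetical sketch: combine event logs with player profiles and compare
# engagement (average session length) across player segments.
import pandas as pd

# Illustrative data; in practice these would come from the game's telemetry
# pipeline and the player database.
events = pd.DataFrame({
    "player_id":       [1, 1, 2, 3, 3, 3],
    "session_minutes": [35, 50, 12, 70, 64, 58],
})
players = pd.DataFrame({
    "player_id": [1, 2, 3],
    "segment":   ["casual", "casual", "spender"],  # e.g. derived from spending patterns
})

# Join the logs with player data, then aggregate per segment.
merged = events.merge(players, on="player_id")
summary = merged.groupby("segment")["session_minutes"].agg(["mean", "count"])
print(summary)  # shows which segments play longer and how often
```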

While it is true that Big Data technology and the cloud are vital for the gaming industry, it is equally important to opt for cloud advisory and big data services only from leading providers with globally recognized, certified consultants: teams with rich experience in developing appropriate big data strategies and selecting right-fit technology aligned with the business needs of the enterprise.