Toward optical quantum computing

Ordinarily, light particles — photons — don’t interact. If two photons collide in a vacuum, they simply pass through each other.

An efficient way to make photons interact could open new prospects for both classical optics and quantum computing, an experimental technology that promises large speedups on some types of calculations.

In recent years, physicists have enabled photon-photon interactions using atoms of rare elements cooled to very low temperatures.

But in the latest issue of Physical Review Letters, MIT researchers describe a new technique for enabling photon-photon interactions at room temperature, using a silicon crystal with distinctive patterns etched into it. In physics jargon, the crystal introduces “nonlinearities” into the transmission of an optical signal.

“All of these approaches that had atoms or atom-like particles require low temperatures and work over a narrow frequency band,” says Dirk Englund, an associate professor of electrical engineering and computer science at MIT and senior author on the new paper. “It’s been a holy grail to come up with methods to realize single-photon-level nonlinearities at room temperature under ambient conditions.”

Joining Englund on

Computer system predicts products of chemical reactions

When organic chemists identify a useful chemical compound — a new drug, for instance — it’s up to chemical engineers to determine how to mass-produce it.

There could be 100 different sequences of reactions that yield the same end product. But some of them use cheaper reagents and lower temperatures than others, and perhaps most importantly, some are much easier to run continuously, with technicians occasionally topping up reagents in different reaction chambers.

Historically, determining the most efficient and cost-effective way to produce a given molecule has been as much art as science. But MIT researchers are trying to put this process on a more secure empirical footing, with a computer system that’s trained on thousands of examples of experimental reactions and that learns to predict what a reaction’s major products will be.

The researchers’ work appears in the American Chemical Society’s journal Central Science. Like all machine-learning systems, theirs presents its results in terms of probabilities. In tests, the system was able to predict a reaction’s major product 72 percent of the time; 87 percent of the time, it ranked the major product among its three
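The "major product among its top three" figure is what machine-learning practitioners call top-k accuracy. As a minimal sketch (toy data and all names hypothetical, not the researchers' actual pipeline), it can be computed like this:

```python
def topk_accuracy(ranked_predictions, true_products, k=3):
    """Fraction of reactions whose true major product appears
    among the model's top-k ranked candidates."""
    hits = sum(
        1
        for preds, true in zip(ranked_predictions, true_products)
        if true in preds[:k]
    )
    return hits / len(true_products)

# Toy example: three reactions, each with a ranked candidate list.
preds = [["A", "B", "C"], ["X", "Y", "Z"], ["P", "Q", "R"]]
truth = ["A", "Z", "S"]

top1 = topk_accuracy(preds, truth, k=1)  # only reaction 1 hits: 1/3
top3 = topk_accuracy(preds, truth, k=3)  # reactions 1 and 2 hit: 2/3
```

The same metric explains why the two reported numbers differ: a model can miss the single best guess yet still place the true product high in its ranking.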

New 3-D chip combines computing and data storage

As embedded intelligence is finding its way into ever more areas of our lives, fields ranging from autonomous driving to personalized medicine are generating huge amounts of data. But just as the flood of data is reaching massive proportions, the ability of computer chips to process it into useful information is stalling.

Now, researchers at Stanford University and MIT have built a new chip to overcome this hurdle. The results are published today in the journal Nature, by lead author Max Shulaker, an assistant professor of electrical engineering and computer science at MIT. Shulaker began the work as a PhD student alongside H.-S. Philip Wong and his advisor Subhasish Mitra, professors of electrical engineering and computer science at Stanford. The team also included professors Roger Howe and Krishna Saraswat, also from Stanford.

Computers today comprise different chips cobbled together. There is a chip for computing and a separate chip for data storage, and the connections between the two are limited. As applications analyze increasingly massive volumes of data, the limited rate at which data can be moved between different

HSA Connections

HSA Q&A with Dr. John Glossner

HSA computing standards have progressed significantly since the HSA Foundation (HSAF) was established in 2012. Today, for instance, there are not only royalty-free open specifications available but also fully operational production systems.

Pictured: Representatives from newly joined HSA Foundation members in China

In this Q&A, Dr. John Glossner, HSA Foundation president, provides additional insights on HSA-specific trends and issues:

What are the connections/differences between heterogeneous computing, general purpose computing and specialized computing? If heterogeneous computing is the future, what will happen to general purpose computing and specialized computing?

General purpose computing is what you find in a CPU. It is designed to process any function, but streaming workloads such as artificial intelligence (AI) may not always run efficiently on a CPU.

Specialized computing would be a design made for one particular application, such as AI, but it would not be intended to run general-purpose code (sometimes called control code). A specialized accelerator typically has the advantage of consuming much less power when executing its special-purpose application (e.g., AI).

Careers at Emerging Technologies

For this ComputingEdge issue, we focus on emerging technologies as they relate to an increasingly popular career transition for computing professionals: the shift from industry to academia. Prior to obtaining a full professorship in information systems at California State University, Fullerton, Sorel Reisman held senior management positions at IBM, Toshiba, and EMI in the US and Canada. He served as 2011 IEEE Computer Society president and is currently a member of both the IEEE Publications Services and Products Board and the IEEE Education Activities Board.
ComputingEdge: You spent considerable time in industry working for multinational companies, starting as an engineer and rising to vice president of development. Why did you leave for academia?
Reisman: When I finished graduate school, I fully intended to pursue an academic career. But academic positions were in short supply at the time, so I looked for a job in industry. And once there, I got caught up in the dynamic of raises and promotions. However, work in industry proved unstable, and unsuccessful company campaigns and projects encouraged me to change jobs several times.
Eventually, a friend who was a physics professor told me

The Real Future of Quantum Computing?

Instead of creating quantum computers based on qubits, each of which can adopt only two possible states, scientists have now developed a microchip that can generate “qudits,” each able to assume 10 or more states, potentially opening a new path to creating incredibly powerful quantum computers, a new study finds.

Classical computers switch transistors either on or off to symbolize data as ones and zeroes. In contrast, quantum computers use quantum bits, or qubits, which, because of the bizarre nature of quantum physics, can be in a state of superposition, simultaneously acting as both 1 and 0.

The superpositions that qubits can adopt let them each help perform two calculations at once. If two qubits are quantum-mechanically linked, or entangled, they can help perform four calculations simultaneously; three qubits, eight calculations; and so on. As a result, a quantum computer with 300 qubits could perform more calculations in an instant than there are atoms in the known universe, solving certain problems much faster than classical computers. However, superpositions are extraordinarily fragile, making it difficult to work with multiple qubits.
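The "more calculations than atoms in the universe" claim is just exponential counting: n entangled qubits span 2^n basis states. A quick sketch (the 10^80 atom count is the commonly cited rough estimate, not a precise figure):

```python
# n two-level qubits span 2**n basis states.
def state_count(n_qubits: int) -> int:
    return 2 ** n_qubits

# Rough estimate of atoms in the observable universe.
ATOMS_IN_UNIVERSE = 10 ** 80

# 300 qubits: 2**300 is about 2e90, comfortably above 1e80.
assert state_count(300) > ATOMS_IN_UNIVERSE

# The doubling pattern described in the text:
# 2 qubits -> 4 simultaneous values, 3 qubits -> 8, and so on.
assert state_count(2) == 4
assert state_count(3) == 8
```

The same arithmetic shows the appeal of qudits: a single 10-state qudit carries as much information as more than three qubits (since 2^3 = 8 < 10), so fewer entangled particles are needed for the same state space.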

Most attempts at building practical quantum computers rely on particles that serve as qubits. However, scientists have long known that they could in principle

Problems With Current Ticketing Systems

Ticketing systems (or issue tracking systems) are a convenient way to help your customers with tough problems, and help your development team find and address bugs faster. For example, you may use an email ticketing system to automatically notify your team when a user submits a potential issue; from there, you can have an individual address the issue, and mark it as resolved in a central database, along with notes on what they fixed (if they fixed anything) and how.
However, like all modern technologies, current ticketing systems aren’t perfect and can cause headaches if you aren’t prepared for their potential downsides.
Biggest Problems With Modern Ticketing Systems
These are some of the most common issues that development teams and customer service representatives face:

Documenting the ticket flow. Let’s say you have a new issue tracking system in place, and it automatically notifies everyone on your development team when there’s a ticket. What happens then? Is someone supposed to log into the platform and claim the issue as their own? Should there be a discussion over chat? If your ticket flow process isn’t clear, you’ll likely end up duplicating efforts or
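The claim-and-resolve flow described above — one person takes ownership so efforts aren't duplicated — can be sketched as a minimal data structure (all names hypothetical, not any particular ticketing product's API):

```python
class Ticket:
    """A minimal issue record with explicit ownership."""

    def __init__(self, ticket_id: int, description: str):
        self.ticket_id = ticket_id
        self.description = description
        self.assignee = None
        self.resolved = False
        self.notes = None

    def claim(self, engineer: str):
        # Single ownership prevents two people fixing the same bug.
        if self.assignee is not None:
            raise ValueError(f"Already claimed by {self.assignee}")
        self.assignee = engineer

    def resolve(self, notes: str):
        # Record what was fixed (if anything) and how.
        self.resolved = True
        self.notes = notes


t = Ticket(101, "Login page returns 500 error")
t.claim("alice")
t.resolve("Fixed null check in session handler")
```

Even a rule this simple, written down, answers the "what happens then?" question: the first person to claim the ticket owns it, and a second claim fails loudly instead of silently duplicating work.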

Quantum Computing Secret

You may not need a quantum computer of your own to securely use quantum computing in the future. For the first time, researchers have shown how even ordinary classical computer users could remotely access quantum computing resources online while keeping their quantum computations securely hidden from the quantum computer itself.

Tech giants such as Google and IBM are racing to build universal quantum computers that could someday analyze millions of possible solutions much faster than today’s most powerful classical supercomputers. Such companies have also begun offering online access to their early quantum processors as a glimpse of how anyone could tap the power of cloud-based quantum computing. Until recently, most researchers believed that there was no way for remote users to securely hide their quantum computations from prying eyes unless they too possessed quantum computers. That assumption is now being challenged by researchers in Singapore and Australia through a new paper published in the 11 July issue of the journal Physical Review X.

“Frankly, I think we are all quite surprised that this is possible,” says Joseph Fitzsimons, a theoretical physicist for the Centre for Quantum Technologies at the National University of Singapore and principal investigator on the study. “There had been a number

Protecting Your Computer with Free Software

Q. Are those free PC antivirus programs safe to use?

A. The web is full of choices, but if you are looking for free protection for your computer, go with a program from an established security software company. You can find roundups and reviews online, and the AV-Test.org site has a list of well-known software creators. Programs that pepper your screen with pop-ups or try to convince you that your computer is full of worms and viruses are often spyware or scams themselves.

Several companies offer free basic versions of their more complete security suites to home users — including Avast, AVG, Bitdefender, Sophos and ZoneAlarm. As the range of malicious software has expanded to other computing platforms, some companies now offer free tools for the Mac and mobile platforms as well; Malwarebytes Anti-Malware for Mac is among the options. Free apps that specifically protect against ransomware (like Bitdefender’s Anti-Ransomware Tool for Windows) can also be found.

When browsing for software, make sure you are actually getting a copy of the company’s free antivirus tool — and not just the free trial version of a more comprehensive paid program. Depending on the program, you

The Computer Memory Terminal

COMMUNITY MEMORY is the name we give to this experimental information service. It is an attempt to harness the power of the computer in the service of the community. We hope to do this by providing a sort of super bulletin board where people can post notices of all sorts and can find the notices posted by others rapidly.

We are Loving Grace Cybernetics, a group of Berkeley people operating out of Resource One Inc., a non-profit collective located in Project One in S.F. Resource One grew out of the San Francisco Switchboard and has managed to obtain control of a computer (XDS 940) for use in communications.

Pictured above is one of the Community Memory teletype terminals. The first was installed at Leopold’s Records, a student-run record store in Berkeley. The terminal connected by modem to a time-sharing computer in San Francisco, which hosted the electronic bulletin-board system. Users could exchange brief messages about a wide range of topics: apartment listings, music lessons, even where to find a decent bagel. Reading the bulletin board was free, but posting a listing cost a quarter, payable by the coin-op mechanism. The terminals offered many users their first interaction with a computer.

Adding New Fonts to the Computer

Q. I want to buy a new font online for my Mac, but how do I get it on my system?

A. The Mac operating system includes a utility called Font Book that you can use to add, remove and organize the fonts on your computer. You can find the program in your Mac’s Applications folder.

After you download a new typeface from an online font shop, double-click the file you received. Font Book should open automatically and display a sample alphabet or character set in the new font. Click the Install Font button at the bottom of the box to add the font to your Mac’s type library.

Font Book checks the fonts it installs to make sure there are no problems or incompatibilities with the new files. The program should also alert you if it finds duplicate fonts on the computer and fix the issue for you. If you want Font Book to remove a font you no longer use, click All Fonts on the left side of the window and select the name of the typeface in the Fonts list. Go to the

Careers Related to the Internet

Florian Michahelles has run Siemens’ Web of Things research group—which investigates the application of Semantic Web technologies to the Internet of Things (IoT)—since 2013. Having worked in the fields of ubiquitous and wearable computing for more than a decade, Michahelles now focuses at Siemens on leveraging Web and semantic technologies to enable new business opportunities, particularly in the fields of wearable sensing and human-robot interaction. He wrote “Internet of Things Reality Check” in IEEE Pervasive Computing’s April–June 2017 issue. We asked Michahelles about IoT-related careers.
ComputingEdge: What IoT-related careers will see the most growth in the next several years?
Michahelles: Any career bridging the disciplines of mechanical engineering, electrical engineering, design, computer science, interactive design, and communications will be in high demand because IoT reaches across these disciplines.
ComputingEdge: What would you tell college students to give them an advantage over the competition?
Michahelles: Go beyond your major and think about also taking courses outside of tech, combining, say, computer science and psychology, business and electrical engineering, or materials science and sensors.
ComputingEdge: What should applicants keep in mind when applying for IoT-related jobs?

Long term data storage

I’ve had a few people ask me just recently what method I would recommend when planning a long term backup strategy.  One elderly gentleman in particular was creating a family time capsule that he wanted his children and grandchildren to be able to view many decades from now.

The question isn’t as easy as you may think.  You might imagine that the data could be burnt to CD, locked in a cupboard, and that it would last forever; unfortunately, this isn’t the case.  There are literally hundreds of suitably stored but physically decayed CDs from my teenage years that I could use as testament to that.

Therefore I’ve made a list of common formats one would usually consider for archiving a large amount of data so you can pick the most suitable one for your needs:

Hard Disk – When used on a regular basis, a hard disk will typically last for around five years before it starts to decay; if it is being used as an infrequently accessed backup drive, we can assume that this lifespan can be at least doubled.   Unfortunately, degradation of the disk’s metallic surface, along with the inevitable seizing

Long term data storage-SSD, Internet, Magneto Optical

Last week I spoke about a gentleman I met who was creating a family time capsule and had come to me to ask the most effective way of archiving data that he wished to remain available beyond his lifetime.

The question is an interesting one, as when you look into the technology available you realise that many forms of media are simply incapable of storing important data for more than a couple of years.   By way of example, a couple who record the early years of their child onto a DVD disc may be disappointed when, ten years down the line, the data has been destroyed by natural degradation of the media.

The last article discussed the pros and cons of hard drives, optical media, flash drives, conventional paper and tape drives, so this week I conclude with the remaining options I would consider:

Solid State Drive – An SSD uses solid-state memory (similar to that used in a flash drive) to store data and is most commonly used as a direct alternative to a hard drive, especially in notebooks, where its light weight, small size and fast access times make it ideal.  Unfortunately, SSDs suffer the

A Brief History of Wearable Computers

Gone are the days when a ‘compact computer’ filled an entire room, or when a laptop required a chunky external battery to be considered a ‘portable’ option. These days, most of us are walking around with smartphones that have many hundreds of times the processing power of the Apollo lunar landing computers. But how far away are we from truly ‘wearable’ computing technology?

Roulette à la ‘James Bond’

The earliest example of a wearable electronic computer was devised by a mathematician in the 1960s: a small counting machine designed to predict the results of roulette spins. The system required cooperation among a group of users to be effective. A data-gathering lookout transmitted the wheel’s spin speed via electronic switches hidden inside their shoes, and the data in question, a coded signal consisting of musical notation, was then sent to a bettor’s earpiece. This system proved outrageously effective when tested in some of the top Las Vegas casinos of the day.

Moore’s law in full effect

The ‘cheating’ equipment from the 1960s evolved into more advanced shoe computers, which saw active use throughout

Installing a graphics card in four easy steps

Specialist companies relish the opportunity to earn a bit of easy money from performing 10-minute fixes. Such is the demand for graphics cards and memory that businesses can make a killing off consumers unwilling to carry out the installation themselves. The truth is, you can find out most of what you need to know through a simple online search and step-by-step guides like the one below.

So, if you’re a PC gamer and need a hand introducing your new graphics card to the system, observe the following points.

Un-install drivers

First off, you’ll want to uninstall your old graphics card’s drivers before inserting the new one. Failing to do this will leave your computer trying to trace the previous chip after the new one has been inserted into the motherboard. So, after right-clicking on ‘My Computer’, click on ‘Properties’, then find ‘Device Manager’ within the ‘Hardware’ tab. Your current card will be listed under ‘Display Adapters’, so, after you’ve accessed the option, click the name of the card to view its properties, then uninstall it. The removal process should take around five minutes.

Remove

After your card’s drivers have been fully uninstalled, a notification should appear confirming this. Upon viewing the message, shut down and turn your machine off at the mains. Open

Advantages of Cloud Computing for the Home

Cloud computing not only transforms home computing, but the way we work and live. If that sounds overblown, consider how working from home and consuming entertainment have changed over the last few years. And the rise of the ‘Internet of Things’, which will co-ordinate internet connected devices, can make your home life more relaxing and enjoyable.

There are already lots of advantages to embracing cloud computing in your home, whether it’s for work, pleasure or managing your household.

Cloud Storage for the home:
One of the big early selling points of cloud computing has been the availability of cheap, plentiful storage space for photos, videos, work documents and anything else you can think of. Cloud storage providers include UK-based Memstore, along with U.S. companies such as Dropbox, Amazon, Apple, Google and Microsoft.

It’s important to check out the various options and not just sign up with the most familiar brand names, as costs can vary depending on the storage needed. And in the UK it’s also well worth considering a UK-based provider like Memset, as they’ll be fully compliant with UK-specific laws and regulations, which isn’t necessarily the case for businesses based elsewhere.

Backing up

The Cheapest Computer to Date!

If we look a few decades back, it was almost impossible to think of working with a computer without real knowledge of how it operated. Gradually, developers made things simpler through graphical operating systems. Now it is simpler still thanks to the Raspberry Pi, developed by a charity called the Raspberry Pi Foundation. It is no bigger than a credit card. The feature that makes it genuinely special is its ease of use, especially for beginners. Another important factor is its price, which is either $25 or $35, depending on the version. The price is good news for those who can’t afford a typical desktop.

These days computers are important, as they have become essential means of communication, be it for business purposes or personal needs. We can assure you that the Raspberry Pi fulfills the basic needs of all classes of people. However, you need to understand one thing clearly: from the moment you take the circuit board out of the package, do not expect things to happen automatically. If you do not know how to work with the Raspberry Pi, land up on Raspberry Pi’s

Cool Mouse Operations You Can Use In Windows

Here are five Windows operations that you can use on occasion with Windows or associated software.

Open new links in brand new tabs on Windows Internet Explorer

If your mouse has three buttons, you can use the middle one to open links in new tabs: place the mouse pointer over a link and press down on the middle mouse button (the mouse wheel).

The middle mouse button can roll forward or back; however, it can also be pressed down and clicked just like a button.  If you do this on a link, it will open that link in a new tab.  This is a lot quicker than right-clicking and choosing “open in a new tab,” and it makes researching items easier by letting you open new tabs with a single click.

If you are feeling super lazy, you can hold Ctrl and press Tab to cycle through your tabbed windows, or you can even hold Alt and press Tab

Tips for Choosing a Tablet PC

Tablet devices are not cheap. Even good starter models are going to cost you over $150, so you have a moral obligation to make sure that you pick one that is going to be right for you. It is in your best interest to purchase a tablet device that has a clear return on investment for you in particular. The tablet must provide you with both short-term and long-term use. You must ensure that the return on your investment is more than just having another gadget to fill your drawers at home. This article has seven tips that will help you pick the right tablet device. It features tips that are less about technical specifications and more about real-world issues revolving around usability, usefulness and practicality.

1 – The feel, shape and grip of the device

This should be the most obvious factor when you are choosing a handheld device, precisely because it is a “hand-held device,” which means it is going to spend a lot of time in your hand.

Why should the feel of the device affect how