
What is the Future of Programming Languages?

The UK's tech community is buzzing with the promise of AI and new computer languages, as the industry seeks to recruit the next generation of computer scientists and engineers.

However, researchers are also exploring the next stage of programming languages, and at the Interac and DevCon2 developer conferences in Shanghai this week the focus was firmly on tomorrow.

"The idea is that you would find yourself in a perfect language one day, but there are many open questions to be resolved in terms of compiler technology, software libraries and compiler tool chains," says Eric Watkinson, director of research and technology at Imperial College London. "This would be about what sorts of things you could do with a new language in the hands of thousands of developers."

Here's a guide to what's happening, and where the tech industry might be heading, in the world of programming languages in 2020.


1. Personalisation

 



The chatbot revolution has now really taken off, with consumers demanding services that feel more human. In the future, this means you could interact with a service through conversation rather than by clicking a button.
Jia Wang, professor of computer science at the University of Southern California, told The Register he expects people will gradually become less reliant on user interfaces and move towards the conversational approach.

"I think in the next few years we will see a lot of these interactions, especially on the mobile front, become more and more user friendly and incorporate a conversational interface," he said. "Your phone will become a personal assistant that helps you carry out activities. I think the whole conversational approach will continue to rise. Think about Tesla cars, where you can call your car and it knows you and gets directions."
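To make the idea concrete, here is a minimal sketch of what replacing button clicks with conversation can look like in code. The intents and keyword rules below are hypothetical; a real assistant would use a trained language model rather than keyword matching.

```python
# Hypothetical intents mapped to trigger keywords (illustration only).
INTENTS = {
    "directions": ["navigate", "directions", "route"],
    "call": ["call", "dial", "phone"],
}

def handle(utterance: str) -> str:
    """Match a spoken/typed request to an intent by keyword."""
    words = utterance.lower().split()
    for intent, keywords in INTENTS.items():
        if any(k in words for k in keywords):
            return f"running intent: {intent}"
    return "sorry, I didn't understand that"

print(handle("Get me directions to the office"))  # running intent: directions
```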


2. Use of in-memory computing 


This is a form of computer architecture that's becoming increasingly popular. In-memory computing keeps working data in main memory rather than on disk, so a program can read and write it without the latency of storage I/O, reloading from persistent storage only every so often.

Adrian Cockroft, professor of computer science at Trinity College Dublin, told The Register that the growth of in-memory computing will allow for an "extreme reduction in latency". He described the advancement as a key "tipping point".

"This is the ability for a program to run in an entirely different way to how it's traditionally been implemented in large software systems," he said. "With flash memory and flash architecture, this should allow us to achieve far higher peak throughput than was previously possible."
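As a small illustration of the principle (not the specific systems Cockroft describes), Python's built-in sqlite3 module can hold an entire database in RAM, keeping disk I/O off the read path:

```python
import sqlite3

# ":memory:" keeps the whole database in RAM - the in-memory computing idea
# in miniature: no disk access on the query path.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [("temp", 21.5), ("temp", 21.7), ("humidity", 40.2)],  # made-up sample data
)
# Queries are served entirely from memory.
for row in conn.execute("SELECT sensor, AVG(value) FROM readings GROUP BY sensor"):
    print(row)
conn.close()
```

In production systems the same idea appears at much larger scale, with flash or non-volatile memory providing the persistence the quote alludes to.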

3. Multi-core and heterogeneous computing


Already common in cloud computing and smartphones, the trend towards multi-core processors is spreading to other areas of computing. Heterogeneous computing takes this further by pairing general-purpose CPU cores with specialised processors such as GPUs, so that each task can run on the hardware best suited to it.
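A minimal sketch of the multi-core half of this, using Python's standard concurrent.futures module; the simulate function is a hypothetical stand-in for real work (the heterogeneous half, handing work to a GPU or other accelerator, needs vendor-specific toolchains and is not shown):

```python
from concurrent.futures import ProcessPoolExecutor

def simulate(seed: int) -> int:
    # Hypothetical CPU-bound task; each call can run on its own core.
    total = 0
    for i in range(100_000):
        total += (seed * i) % 7
    return total

if __name__ == "__main__":
    # ProcessPoolExecutor defaults to one worker per CPU core, so the
    # eight tasks below run in parallel on a multi-core machine.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(simulate, range(8)))
    print(results)
```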

4. Programming language evolution

 
 



Although there will be "no standardisation" of a new programming language, industry research groups are looking at ways to make new programming languages more "refined" or efficient. The new languages will have simpler syntax to make them more accessible to non-specialists, Watkinson said.

Prof Richard Feldman, founder of Gosling – a long-standing programming language research team based at the University of British Columbia in Canada – told The Register that many areas of the industry are developing programming language research "in a serious way".

"As those languages improve, the ability for those languages to be translated into a variety of hardware architectures should improve as well," he said. "That means that if you want to build a better [computer] chip, you can in theory just pick up the language that's developed for that chip."

Another benefit is that the efficiency of these languages will help new coding approaches, such as block-based programming or functional programming, to take hold.
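As a taste of the functional style mentioned above, here is a small pipeline built from pure functions rather than mutated state; the order values and 20% tax rate are made up for illustration:

```python
from functools import reduce

# Made-up order values for illustration.
orders = [12.5, 3.0, 99.0, 42.0]

# Declarative pipeline: filter, transform, then fold - no variable is mutated.
total_with_tax = reduce(
    lambda acc, price: acc + price * 1.2,        # hypothetical 20% tax rate
    filter(lambda price: price > 10, orders),    # keep orders over 10
    0.0,
)
print(total_with_tax)  # 184.2
```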

5. A new age of hackers


Watkinson said the blockchain concept could give rise to a new form of computer malware.

"I think we are going to see a lot of hacking with digital currency and there's going to be a lot of people getting in trouble over that, but at the same time there is a good chance that many of them will not," he said. "Blockchain is a kind of cryptography. It's something where it is possible to manipulate data, and the developers of that application can then be tricked into believing the data is there when it isn't. The only way to gain control of it is to hack the [blockchain application]."
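The cryptographic property behind that claim can be sketched in a few lines: each block commits to the hash of the previous one, so altering earlier data is detectable. This is a toy illustration with made-up transactions, not a real blockchain:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's contents, including the previous block's hash,
    # so any change to earlier data invalidates every later link.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = []
prev = "0" * 64  # genesis marker
for payload in ["alice pays bob 5", "bob pays carol 2"]:  # hypothetical data
    block = {"data": payload, "prev": prev}
    prev = block_hash(block)
    chain.append(block)

# Tampering with an earlier block breaks the chain of hashes.
chain[0]["data"] = "alice pays bob 500"
valid = all(chain[i + 1]["prev"] == block_hash(chain[i])
            for i in range(len(chain) - 1))
print("chain valid:", valid)  # chain valid: False
```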

6. The rise of quantum computing


Quantum computing will be extremely powerful because its fundamental unit – the quantum bit, or qubit – is not limited to encoding either a one or a zero the way a classical bit is; a qubit can hold a superposition of both states at once.

Eugene Bloch, a physicist and research fellow at the Australian National University in Canberra, said quantum computing is likely to come in the next 10 to 20 years. "You need to look further out than the next 10 years because it may be 20 years before we actually get to a reliable quantum computer," he said. "You might see a kind of smaller scale first and then you will see a more general purpose quantum computer in the next 20 years."
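A rough way to see what superposition means is to simulate the measurement statistics of a single qubit in plain Python. This is a classical simulation, of course; the power of real quantum hardware comes from many qubits interfering, which this sketch does not capture:

```python
import random
from math import sqrt

# A qubit's state is a pair of amplitudes over the |0> and |1> basis states.
# Here: an equal superposition (a hypothetical example state).
amp0, amp1 = 1 / sqrt(2), 1 / sqrt(2)

def measure() -> int:
    # Born rule: the probability of each outcome is the squared amplitude.
    return 0 if random.random() < amp0 ** 2 else 1

samples = [measure() for _ in range(10_000)]
print("fraction of zeros:", samples.count(0) / len(samples))  # ~0.5
```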


7. Trusted computing


A trusted computing model that relies on network-level communication would make it possible for people to remotely access computers, such as medical devices, without compromising the integrity of the network. Cryptography can provide an additional layer of security in these scenarios, but the devices and the network would need a standardised way of ensuring trust is maintained.
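One concrete form that cryptographic layer could take is message authentication. The sketch below uses an HMAC with a shared key; the key and message are hypothetical, and a real deployment would use per-device keys, key rotation and a standardised attestation protocol:

```python
import hashlib
import hmac

# Hypothetical shared secret; real systems would provision one key per device.
SHARED_KEY = b"example-device-key"

def sign(message: bytes) -> str:
    # Tag the message so the network can verify it came from a trusted device.
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    # Constant-time comparison guards against timing attacks.
    return hmac.compare_digest(sign(message), tag)

reading = b"heart-rate=72"  # made-up medical-device telemetry
tag = sign(reading)
print(verify(reading, tag))            # True: message intact
print(verify(b"heart-rate=172", tag))  # False: tampering detected
```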

8. Market consolidation 


There are currently more than 1,000 software vendors in the computer industry, but Watkinson said they will be squeezed out by bigger groups that control the most components. He cited Apple, IBM and Microsoft as examples of companies that might push out smaller competitors in the race to become the world's largest software company.


9. Reality on demand 


Augmented and virtual reality (AR and VR) will be a "massive commercial opportunity", but few users will adopt them immediately because consumers won't buy the hardware until there is compelling content, Watkinson said. 5G networks will be needed to deliver AR and VR experiences, but the market will also need a "better underlying virtual reality technology", he said.


10. New coding languages


New programming languages are on the horizon, said Feldman, but it will take five to 10 years for the market to mature before developers are comfortable creating applications for the next generation of computing. 
