Information Technology

Nigeria May Lose N612bn Yearly ICT Revenue Due to Counterfeiting

The roughly $2 billion in revenue that Nigeria earns yearly from activities at Africa’s largest Information and Communications Technology (ICT) centre, Computer Village in Lagos, is currently under threat from counterfeiters.

Counterfeiting has increased by almost 50 per cent within three years of a major raid by the Standards Organisation of Nigeria (SON) on the market, located at Otigba, in Ikeja.

Besides churning out substandard products, with their attendant health and safety hazards, these nefarious activities also bleed the economy, as manifested in low trade volumes and lost revenue.

Former Minister of Communications Technology, Dr. Omobola Johnson, at a Lagos forum shortly before leaving office in 2015, revealed that Computer Village contributes $2 billion (about N612 billion) yearly to the economy, noting that the fund comes mostly from phones and exploration of all sorts of software applications.

Already, the World Bank has hinted that global economies lose between $600 billion and $700 billion yearly to counterfeiters and theft of intellectual property.

The Guardian report indicated that the market for counterfeit and pirated products is of two kinds – primary and secondary.

At the primary market, according to a source within the market, consumers purchase counterfeit and pirated products believing they have purchased genuine articles.

According to him, at the secondary market, the buyers duly know what they are purchasing is fake but for the price. Noting that measures to combat the menace at both markets differ, he added: “It is therefore important to know how much of a threat each poses when considering product specific strategies.”

A further investigation showed that some brands of products, especially mobile phone lines such as Samsung, LG, Gionee, Wiko, Asus, Lenovo and Tecno, suffer counterfeiting rates of as much as 10 to 25 per cent. Home appliances, including juicers, sandwich makers, irons, electric kettles and blenders, are affected by as much as 20 per cent. Other products prone to faking include printers, cartridges and ink.

Confirming the development, the Managing Director of Tecno Nigeria, Chidi Okonwko, said the brand had been counterfeited by 10 per cent, stressing that some peddlers often display the fake products at different locations within the market precinct.

“It is affecting a whole lot of things in this market. We hear people complain about poor network Quality of Service (QoS); the activities of these phone fakers contribute a lot to that. So users of such phones are losers, government is losing revenue, and we are losing the market too as original equipment manufacturers. This spells doom for the market,” he stated.

A major operator in the market, who simply gave his name as Chukwuemeka Alika, said there had been a rise in the activities of counterfeiters in the market lately, saying except urgent measures were adopted, the problem “may snowball into an open confrontation. It is annoying that you will see a lesser brand of the phone you are selling in the market at a very ridiculous price. Those of us who have spent huge money to import the originals will be at a loss.”

It was also learnt that most of these fake products come from Asia, especially Vietnam, Thailand, Taiwan and even China. The Nigerian Communications Commission (NCC), in 2015, disclosed that about 250 million substandard phones were being sold yearly in the country.

The Executive Vice Chairman of the commission, Prof. Umar Danbatta, who gave the figure, then in an acting capacity, noted that the damaging impact of the products on the economy could not be quantified in socio-economic terms.

Specifically, he lamented the adverse effects of the products on a broad spectrum of the national life, saying that the ugly development posed grave danger to the environment as well as the health, safety and privacy of the buyers.

“Counterfeiting is a growing economic problem affecting a wide range of products in the ICT sector. Mobile phones are especially targeted with some 250 million counterfeits sold yearly. This number constitutes about 15 to 20 per cent of the global mobile phone market.

“Apart from the obvious negative economic impact of this ugly trend on the manufacturers of genuine products, there are other consequences for operators, government and authorised dealers which include brand evaluation, loss of revenue, copyright and trademark infringement, unfair competition, loss of tax, cost of compliance with applicable national legislation, national security and loss of employment opportunities.

“This menace also poses danger to the health and safety of consumers, equally breaching the privacy of consumers. A collective effort is urgently needed to curtail counterfeiting in ICT.”

Enterprise software: The big trends and why they matter

Last month, Salesforce made it onto the list of the top 10 enterprise software vendors for the first time. The arrival of a software-as-a-service provider among the elite says a lot about the growing importance of the cloud in business applications.

The cloud may be on the rise, but Salesforce’s $3.8bn 2013 revenues are dwarfed by Microsoft’s $65.7bn, and by the combined $142.9bn of the top four enterprise software vendors — Microsoft, Oracle, IBM and SAP:

Source: Gartner, April 2014

Although there’s a lot of inertia, the cloud and software-as-a-service (SaaS), along with several other technology trends — including mobile, analytics and consumerisation — are profoundly changing the enterprise software landscape. Here’s how another analyst firm, Forrester Research, sees the forces that are reshaping business applications:

Source: Forrester Research, September 2013


The rise of cloud and the utility model represents one of the biggest shifts in the way enterprise applications are written, deployed and consumed. The global public cloud market will be worth $191bn by 2020, up from $58bn in 2013, with cloud applications accounting for $133bn, according to Forrester.

Cloud technologies can also be seen as the biggest threat to the established order of on-premise software and the vendors that supply it.

“It’s sort of scary for them because you can see a world in which people say, ‘I’m going to buy order management from him, item master management from him, and vendor management from him and I will make it all work together, or the semantic web will’,” Forrester Research VP and principal analyst George Lawrie told ZDNet.

To counter the growing appetite for cloud services, one of the approaches employed by big enterprise software vendors is to work with firms such as Accenture, Deloitte, Infosys and Wipro, according to Lawrie.

“‘Why don’t you host it? Why don’t you provide a kind of managed service — a complete turnkey SAP or Oracle, or whatever it is — and we’ll call that cloud’,” he said.

“If you talk to them, they’ll say, ‘We’ve been doing cloud for ever’. They’ll call that cloud. I’ve just finished working with a company that said, ‘OK, it may not be cloud. But what’s the difference? It walks like a duck, it quacks like a duck — I’m calling it cloud’.”

A second approach is the one adopted by SAP with Business ByDesign.

“They’re saying, ‘No, no, no — this is a native cloud application. We’ve written it from the ground up’. But they’ve been really slow to get there,” Lawrie said.


“It has no inventory at all. At first I thought, why are they doing that? It’s obvious: they don’t want to cannibalise their own business. They’re saying, ‘This is for services businesses. You’re a manufacturing business. That’s much manlier. You’ve got to have stuff on-site and your own IT people’. And actually most manufacturers believe that.”

The shift to the cloud and the utility model has caught out not only vendors, but also big users of on-premise software, according to researcher Simon Wardley at CSC’s Leading Edge Forum.

“When we talk about enterprise IT and all the changes going on, and ERP and CRM becoming more of a commodity provided by utility services, all of these are pretty predictable changes that have been known about for some time,” he said.

“It’s a process from the novel and new, to the uncharted, to the commodity, the utility, the industrialised,” added Wardley.

“That impacts not only vendors that may have inertia barriers in terms of existing business models; it will also impact companies that consume it in various ways — not just in efficiency and the ability of competitors to build things more quickly, but it can also reduce barriers to entry into your business.”

Research into strategy that Wardley conducted among a group of 150 companies highlighted shortcomings in organisations’ grasp of fundamental changes in enterprise software.

“They have great big strategy documents full of how, what and when, and very little why. The why, when you break it down, is normally 60 to 70 percent of what other people are doing — like cloud, big data and social media. They’re acting in that environment without actually understanding the landscape,” he said.

“If you think about why a general bombards a hill, he doesn’t do it because he’s got some consultant’s report saying 60 to 70 percent of generals bombard hills, therefore what you should do is bombard a hill.”

That lack of insight causes obvious issues in the application of technology for things like cloud and the shift from product to utility.

“We’ve had lots of time to prepare for these changes. By the early to mid-2000s the signals were screaming that this change was going to occur, but still companies were unprepared,” Wardley said.

“They had lots of inertia due to past practices, and best practices in the product world, which are of course going to be different to best practice in the utility world. The companies being impacted by this are generally those that have very poor situational awareness. They’re surprised by the highly predictable.”


Utilities such as SaaS have their place, but they are not going to replace everything, according to Ovum principal analyst Roy Illsley.

“It is probably eventually going to get to potentially 60 to 70 percent of the market — that’s what most people expect,” he said.

But SaaS is also having a wider impact, by changing perceptions about the need for tailor-made applications.

“SaaS has revolutionised things because in some respects companies are quite happy to say, ‘You know what? We don’t need to have that app bespoke. We can have that as standard, delivered from the cloud’. That’s where ServiceNow and Salesforce have made a killing,” said Illsley.

Forrester’s George Lawrie says people are typically buying the non-differentiated systems of record and building systems of interaction.


“Those systems of interaction are the ones that have the consumerisation and which are differentiated for their audience,” he said.

“For example, nobody uses the screens that SAP delivers. They always build their own screens — always, always, always — which are optimised for their own experience for their employees.”

However, businesses may be fooling themselves about the significance of the customisations they are undertaking, according to CSC’s Simon Wardley.

He once asked how many of a group of 100 CIOs had ERP.

“One hundred hands went up. Then I said, ‘Well, it’s commodity if you’ve all got it’. Two arms shot up saying, ‘No, no, it’s not commodity’ because of their customisations, which made it special,” Wardley said.

“So then I asked them what their customisations were. Of course, they were very secretive but after a bit of badgering they told me one, and so I asked the entire room, ‘Does anybody have this?’. One hundred hands went up.”

This group of CIOs was spending billions of dollars a year in total customising pretty much the same systems in almost identical ways, creating no differential value for any of them, said Wardley.

According to Ovum’s Roy Illsley, the effect of companies like Salesforce has been to change what businesses focus resources on. They are now often happy to adopt a standard SaaS CRM system, for example.

“It’s the same as everybody else’s. It’s not the tail wagging the dog like it used to be,” he said.

“Of course, you’re going to get slight variations where people want bespoke stuff. But in general it seems in this world everybody else does something like this, so how does that differentiate this company from that company?”

It’s not the app that makes a difference — it’s what a company does that rivals can’t see and copy, Illsley added.

“It’s not that little bit of software that tells us where customers have bought it or their balance is X — every bit of software can do that.”


Nevertheless, the attractions of cloud-based applications may not be enough to trump business inertia, according to CSC’s Simon Wardley.

“Of course existing consumers will often have resistance to that change for numerous reasons — existing processes, political capital. There are often all sorts of inertia barriers,” he said.

“It’s difficult to say, ‘That billion-dollar ERP system I’ve just implemented, I can now buy on a credit card’ without looking foolish.”

According to Wardley, inertia is inevitable among user organisations as well as among vendors.

“People will always try and hold on. They always want the past so that it doesn’t cost anything to change but they want it like the future.”

“Unfortunately, you can’t have volume operations customised for you and commodity provided with non-commodity components. It doesn’t work,” said Wardley.

But there can be sounder business reasons for sticking with traditional on-premise approaches, according to Ovum’s Roy Illsley.

“There are still some traditional companies that use apps running specifically for their business purposes. So if you’re a bank, your core banking applications are almost certainly still running on a mainframe,” he said.


“They’re a completely different set of apps from the ones you’d run if you’re something like a PayPal, which could be construed to be a bank but hasn’t got all the same baggage that a bank has.”

“They’re probably running things in a slightly different way because they’re a newer organisation that has grown up without having all this legacy stuff that works,” said Illsley.

“If a core banking application works, you’re not going to throw it out and spend millions of pounds to replace it with something that you’re not actually sure of.”

However, elements of ERP are breaking away and being used by businesses as discrete software components.

“Because [companies] do want to be connected but actually, as long as they can collaborate and see and share information, it doesn’t need to be a monolithic great big app that runs absolutely everything,” Illsley said.

“It can be discrete fragmented apps — with the apps being run locally but the data and management being centralised.”

As examples, he cited airlines using iPads to locate passengers without printing out reams of paper, and restaurants using iPads for bookings, orders and recording customer names.

“The mobility aspect is being continually rolled out and that needs the apps to work in a different way because you can’t necessarily have a full ERP capability running on an iPad. You’re going to have to do something slightly differently,” Illsley said.

“That ‘something slightly differently’ may be a front-end app sending information back and then you’ve got the app at the back end that just does the back-end processing, and that’s standard.”

“How you crunch numbers, how you docket the numbers, how you present the numbers in a financial market or how you count stock — that’s standard stuff that heavy-duty processing can do but it’s almost certainly happening in real time now and not anything like batch processing.”


The increasing provision of services to the device is why mobility and the demand for real-time information are two of the big enterprise software trends, along with the cloud.

“The cloud has chipped away, SaaS has chipped away at elements of enterprise apps and you see elements of it twisting off into cloud and SaaS-like operations,” Illsley said.

“But with mobility and real-time applications, what you’re now getting is the demand for the tool to deliver faster to more places the right sort of capability for the people who are using it.”

“You’re not expecting them to come in and sit at a terminal and use green-on-black to fill out a form when taking an order. Those days are well and truly gone.”

The rippling impact of consumerisation has affected not only the nature of mobile applications, but also the whole look and feel of enterprise software, according to Forrester’s George Lawrie.

“If it isn’t shiny and iPady, they don’t want to use it. So that consumerisation has driven another leap in the way the vendors need to provide the applications,” he said.


“And it’s much more complicated than it was for client-server because these enterprise applications, as soon as they start to become instantly consumable, you can provide them not just to employees but maybe to your customers’ employees and maybe even to consumers.”

“As soon as you do that, you’ve got to have a different kind of infrastructure because you now don’t know how many hits you are going to get on your system,” said Lawrie.

“First of all you’ve got to componentise those applications. So all that effort that’s gone into making it very monolithic — ‘It’s all got to go into one database or it will never work’ — that all just doesn’t really work anymore. It’s got to be much more subtle.”

Just like consumer applications, business software has to be intuitive and immediately usable.

“You can’t tell people they’ve got to go on a week’s course and you can’t keep telling them they’re wrong. Which has typically been the way with enterprise systems: if anything doesn’t work, it’s your fault,” Lawrie said.

“That’s not really acceptable to people who are used to buying a holiday online or ordering something from Amazon.”

What has also changed in organisations is that there is less time and there are fewer resources for learning, Ovum’s Roy Illsley said.

“So what they need is for people now to be more general and wear more than one hat. You’ve got people needing to be able to pick up other areas of the business they’re not familiar with to be able to use the tool, and to do that it’s not got to be shrouded in technology and language, and specific processes that they can’t get their head around.”

The underlying trend across all applications is that software must not only do its job and be easy to use; it must also support collaboration and sharing, be simple to deploy and support and, where possible, let users support themselves, Illsley said.

“A lot more investment has gone into how it looks and feels for the people who use it and what they get out of it, and into making it easier to manage from a back-end perspective, so that the cost of using the app is coming down,” he said.


The push for software cost reduction and ease of use, both for business consumers of IT and tech staff, is fuelling a drive for greater automation, according to Forrester’s George Lawrie.

“There’s an extent to which this stuff gets automated. We’ve seen that with the virtualisation of servers and networks and storage. You can spin up a new server out of your datacentre without you having to do any manual work at all,” he said.

“All that kind of stuff that operations people used to do — that’s all automated, eventually it’s standardised and then it’s given away. You shouldn’t even be doing it. You should be pooling it with someone else.”

The second aspect of this trend is increasing self-service features for end users.

“All those days when they used to say, ‘Can you do a report for me?’ ‘No. Do your own report. Here’s the data. Here’s the tool. Have at it’,” Lawrie said.

As well as breaking applications into blocks, consumerisation is also leading to changes in business process management and rules: where rules and processes were once embedded in code, that is increasingly no longer the case.

“There will be business rules — for discounts, for example — that would never be in the program now. They’d always be in a table that users can maintain easily themselves. So you want to change the policy? There’s the table. Change it. You want to change the process? We’re going to do the credit check before we ship instead of when we take the order. There you are, you can do it. It’s not an IT thing,” Lawrie said.
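A toy sketch of what Lawrie describes, with entirely hypothetical rule names and values: the discount policy lives in a data table that business users could edit, while the program only interprets it.

```python
# Hypothetical sketch: discount rules kept as data, not code.
# Users edit the rule table; the program only interprets it.

discount_rules = [
    # minimum order value, customer tier, discount rate
    {"min_order": 1000, "tier": "gold",   "rate": 0.10},
    {"min_order": 1000, "tier": "silver", "rate": 0.05},
    {"min_order": 0,    "tier": "any",    "rate": 0.00},
]

def discount_for(order_value: float, tier: str) -> float:
    """Return the first matching discount rate from the rule table."""
    for rule in discount_rules:
        if order_value >= rule["min_order"] and rule["tier"] in (tier, "any"):
            return rule["rate"]
    return 0.0

# Changing the policy means editing the table, not the program:
print(discount_for(1500, "gold"))   # 0.1
```

The point of the design is that tightening or loosening the policy is a data change, not a software release.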

That philosophy is also making itself felt in the embedding of analytics into applications, both in mobile and on-premise enterprise software. According to Lawrie, those analytics can involve real heavy lifting, mathematics, computation, and some exploration of correlations.

“Increasingly, the idea is to throw data into a big lake and bring the analytic to the data, not the other way round. When we say analytic we mean, ‘Let’s take all these observations and let’s see what correlates to something else’ — the unknown-unknown problem,” he said.

“With the elastic cloud and with some new software, that means completely new insights for end users and potentially their being able to orchestrate the rules of their transaction systems alongside these systems that are giving them insights. So they can test and learn in a way they never could before.”

Retailers have been doing these types of analytics for years offline, but now they are becoming more common thanks to technologies such as in-memory computing.

“You’re beginning to see people doing this inline. So at any utility or telecoms company, the person who is probably the lowest-paid person — the customer service person — will have in front of them something that estimates your lifetime value to that company,” Lawrie said.
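As a rough illustration of the kind of figure such a customer-service screen might display, here is a naive lifetime-value estimate; the formula and numbers are hypothetical, not from the article.

```python
# Illustrative sketch only: a naive customer lifetime value (CLV) estimate
# of the kind an inline analytic might surface to a customer-service agent.

def lifetime_value(monthly_revenue: float, margin: float,
                   monthly_churn: float) -> float:
    """Expected tenure is 1/churn months; CLV = revenue * margin * tenure."""
    expected_tenure_months = 1.0 / monthly_churn
    return monthly_revenue * margin * expected_tenure_months

# A telecoms customer paying 40/month at 30% margin with 2% monthly churn:
print(round(lifetime_value(40.0, 0.30, 0.02), 2))  # 600.0
```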

Ovum’s Roy Illsley cites retail as an area that’s increasingly seeing the value of being able to deliver real-time dynamic price updates to stores.

“Traditionally, price updates were something that you run overnight. You produce a great big file and send it out next morning to the store,” he said.

“But because the technology is there and the applications are now being tweaked to enable it, you can now get real-time dynamic price updates out from a central location to all the stores to say, ‘It’s hot weather. Let’s increase the ice cream price — bash. Increase the beer price — bash’.”

Because the apps can take account of external events, decisions can be made faster — and the infrastructure now can support that delivery of information. The app and the infrastructure have come together to enable the real-time delivery.
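Illsley’s hot-weather example can be sketched in a few lines. The following is an illustrative toy only — the function, event names and multipliers are hypothetical, not drawn from any real retail system.

```python
# Illustrative sketch: a central pricing service applying event-driven
# adjustments that could be pushed to stores in real time (all names and
# multipliers are hypothetical).

def price_updates(base_prices: dict, event: str) -> dict:
    """Apply event-specific multipliers and return the adjusted price list."""
    adjustments = {
        "hot_weather": {"ice_cream": 1.20, "beer": 1.10},  # raise in-demand items
        "cold_snap":   {"soup": 1.15},
    }
    multipliers = adjustments.get(event, {})
    return {item: round(price * multipliers.get(item, 1.0), 2)
            for item, price in base_prices.items()}

# Pushed to every store as soon as the event is detected, not overnight:
prices = {"ice_cream": 2.00, "beer": 4.00, "bread": 1.00}
print(price_updates(prices, "hot_weather"))
# {'ice_cream': 2.4, 'beer': 4.4, 'bread': 1.0}
```

The contrast with the overnight batch file is that the same rule table can be re-evaluated and distributed the moment an external event arrives.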

“If you look at what’s happened in the infrastructure, you’ve gone from the monolithic app running on a physical server in a datacentre with one app per server architecture,” Illsley said.

“The cloud and virtualisation have come along and now you’ve got apps that are infrastructure-aware and that can scale themselves. How infrastructure can be turned into a flexible resource is something the apps have now cottoned on to. That’s a big shift in app development.”

IT Trends for 2017

Like any year before it, 2017 will bring its own problems and solutions, shaping the way we both use and think about technology.
So without further ado, take a dive into the future and check out some of the most exciting tech trends to look forward to in 2017.

Virtual Reality

With forecasts predicting its growth into a $30 billion market as early as 2020, much has been said about the bright future of virtual reality.

Although the technology remained on the verge of mainstream culture throughout most of 2015, things finally started to pick up over the last 12 months – and it seems this time around VR might legitimately reach the masses next year.

VR has come a long way since Google thrust it closer to popular culture with the release of Cardboard back in 2014 – both in terms of performance and availability. Within the space of a year, a number of leading manufacturers launched their own headsets, steadily pushing the technology toward mainstream adoption.

While Facebook-owned Oculus kicked off the relay with the release of the Oculus Rift in late March, HTC quickly followed up with the launch of its Vive headset in April. In August, Samsung dropped the revamped Gear VR headset alongside the now-discontinued Galaxy Note 7, keeping the momentum going until Sony delivered its long-awaited PlayStation VR companion in October. Closing the cycle, in November Google unveiled its pimped-up Cardboard successor, Daydream View.

In addition, Microsoft, Nintendo and Qualcomm have also expressed ambitions to develop VR hardware, but details and timelines remain hazy.

One of the more pressing concerns with the wider adoption of VR has always been the scarcity of content and experiences, but with Google, Oculus and Valve opening their own dedicated VR marketplaces, this barely presents a hurdle anymore.

Another development to factor in is the recent announcement of the Global Virtual Reality Association which will unite the biggest names on the tech scene – including Google, Oculus, HTC, Sony, Samsung and Acer – to make better content for VR.

With the sheer volume of headset manufacturers and content creators, it’s hardly surprising VR is finally starting to garner the attention of consumers and mainstream media – and you can bet this trend will grow even further over the next year.

Augmented Reality

Meanwhile, augmented reality is also making progress – and the staggering success of Pokémon Go proves the technology has immense potential to influence consumers in engaging and meaningful ways.

Like many business analysts, Apple CEO Tim Cook has consistently voiced his belief that AR has the potential to be bigger than VR, and it seems the iPhone-maker is hellbent on getting a piece of the action.

Back in 2015, Cupertino acquired augmented reality developer Metaio, but while the company has since kept quiet about its AR initiatives, numerous reports began surfacing on the Web over the last year.

Earlier in November, Apple was rumored to be working on its own AR glasses in the style of Google Glass. Around the same time, news outlets further speculated that the company is prepping an iOS update that will brush up the iPhone 7 with boosted camera capabilities and also introduce a heap of augmented reality features.

Though we’ll have to wait a little longer to see how this pans out, one thing is for sure – you can expect to see a whole lot more AR in 2017.

Autonomous driving

While it would’ve been a terrifying sight a few years back, chances are we’ll be noticing vehicles without drivers more often next year.

As technology continues to evolve, industry titans are gradually venturing into building autonomous vehicles. In fact, competition in the self-driving market is heating up at an exponential rate – and the results are starting to show.

Since initially introducing its ‘Autopilot‘ feature back in 2015, Tesla has been steadily touching up the autonomous capabilities of its vehicles, demonstrating the vast potential self-driving technologies hold for the future. In fact, CEO Elon Musk has said the car-maker has plans to cram even more self-driving hardware into its future models in hopes of facilitating entirely hands-free rides.

Tesla isn’t the only company experimenting with this technology though.

Google has been running trials with its own autonomous cars in Mountain View, Austin, Kirkland and Phoenix, and while some driving sessions have been less successful than others, its vehicles have clocked over two million miles in the meantime.

Uber is also in on the action. The ride-sharing giant recently acquired self-driving hardware developer Otto and has since successfully put its first fleet of self-driving trucks on the road; and although it wasn’t the first one to do it, the company also ran some real world self-driving tests with its cabs in Pittsburgh.

In addition, Apple and BMW are also said to be planning their first forays into self-driving technologies in the near future, but the latest reports suggest the Big A might wait a few more years before going all-in on building its own vehicles.

On another front, researchers have also been pushing the envelope, developing robust new systems and algorithms for real-time object detection, which could make self-driving vehicles even safer and more reliable – and help eliminate the kind of mishaps that have already made headlines.

What else

In the midst of all these autonomous cars driving around, another thing to look forward to is drone deliveries.

Once a viral publicity stunt, drone delivery might finally become a thing in 2017. Google, Amazon and Domino’s have all been messing around with the technology over the last year, conducting a series of field tests in various locations including the US, the UK and New Zealand. Meanwhile, UPS and Walmart have also been gearing up to begin delivering packages by air for some time now.

Still, it remains to be seen which company manages to take the technology to large scale. In any case: Having drones drop your pizza from the heavens no longer seems like such an outlandish idea.

Perhaps slightly less exciting, our homes might get much more functional and interactive next year.

Following the unanticipated success of the Amazon Echo, earlier this year Google unveiled its own Google Home smart speaker to rival the e-commerce giant. Recent rumors further suggest Apple and Samsung might be considering smart speakers of their own.

Taking into account that Amazon and Google both leverage their respective Alexa and Assistant artificial intelligence systems to power their smart speakers, the speculation doesn’t seem that far off.

Apple recently opened up Siri to third-party apps and the next logical step would be to integrate it into other devices. Samsung has also been developing its own voice-assistant service with plans to integrate it into future home appliances and wearable devices. So get ready for some frustrating conversations with your home.

One last thing: we’ll probably have to wait a few more years for the full wireless revolution, but you can expect to see a lot fewer wires next year. In light of Apple axing the headphone jack on the iPhone 7, Samsung is also expected to ditch the standard audio port on the Galaxy S8, slated to arrive early next year.

This should give headphone manufacturers a little more incentive to put the new Bluetooth 5 to good use.

What do you think will be the leading tech trends in 2017? Spark the discussion and share your opinion down in the comments.

Technology lights the way for quantum computing

Researchers at Tyndall National Institute develop scalable, electrically driven photon sources to drive powerful quantum technologies

Quantum computing is heralded as the next revolution in global computing. Google, Intel and IBM are just some of the big names currently investing millions in the field, which promises the faster, more efficient processing needed to power our future computing needs.
Now a researcher and his team at Tyndall National Institute in Cork have made a ‘quantum leap’ by developing a technical step that could enable the use of quantum computers sooner than expected.
Conventional digital computing uses ‘on-off’ switches, but quantum computing looks to harness quantum states of matter — such as entangled photons of light or multiple states of atoms — to encode information. In theory, this can lead to much faster and more powerful computer processing, but the technology to underpin quantum computing is currently difficult to develop at scale.
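The correlation that makes entangled photons useful for encoding information can be illustrated with a short numerical sketch. The state vectors and the Bell state below are standard quantum-mechanics textbook constructions, not anything specific to the Tyndall device:

```python
import numpy as np

# Single-qubit computational basis states |0> and |1>.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# A classical bit is either |0> or |1>; a qubit can sit in any
# normalized superposition a|0> + b|1>.

# Two-qubit Bell state (|00> + |11>) / sqrt(2) — the canonical
# entangled state, analogous to a pair of entangled photons.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Measurement probabilities over outcomes 00, 01, 10, 11:
# 00 and 11 each occur with probability 1/2, while 01 and 10
# never occur — the two measurement results are perfectly linked.
probs = np.abs(bell) ** 2
print(probs)  # [0.5 0.  0.  0.5]
```

The vanishing probability of mismatched outcomes is exactly the "linked actions" property the article describes: measuring one photon fixes what the other will show.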
Researchers at Tyndall have taken a step forward by making quantum dot light-emitting diodes (LEDs) that can produce entangled photons (whose actions are linked), theoretically enabling their use to encode information in quantum computing.
This is not the first time that LEDs have been made that can produce entangled photons, but the methods and materials described in the new paper have important implications for the future of quantum technologies, explains researcher Dr Emanuele Pelucchi, Head of Epitaxy and Physics of Nanostructures and a member of the Science Foundation Ireland-funded Irish Photonic Integration Centre (IPIC) at Tyndall National Institute in Cork.
“The new development here is that we have engineered a scalable array of electrically driven quantum dots using easily-sourced materials and conventional semiconductor fabrication technologies, and our method allows you to direct the position of these sources of entangled photons,” he says.
“Being able to control the positions of the quantum dots and to build them at scale are key factors to underpin more widespread use of quantum computing technologies as they develop.”
The Tyndall technology uses nanotechnology to electrify arrays of the pyramid-shaped quantum dots so they produce entangled photons. “We exploit intrinsic nanoscale properties of the whole “pyramidal” structure, in particular, an engineered self-assembled vertical quantum wire, which selectively injects current into the vicinity of a quantum dot,” explains Dr Pelucchi.
“The reported results are an important step towards the realisation of integrated quantum photonic circuits designed for quantum information processing tasks, where thousands or more sources would function in unison.”
“It is exciting to see how research at Tyndall continues to break new ground, particularly in relation to this development in quantum computing. The significant breakthrough by Dr Pelucchi advances our understanding of how to harness the opportunity and power of quantum computing and undoubtedly accelerates progress in this field internationally. Photonics innovations by the IPIC team at Tyndall are being commercialised across a number of sectors and as a result, we are directly driving global innovation through our investment, talent and research in this area,” said Dr Kieran Drain, CEO at Tyndall National Institute.

Top 10 Technology Trends for 2016

Think of your last 24 hours. Chances are you’ve had several moments of continuous connection with information, apps, services, devices and other people. This “digital mesh” surrounds the individual and new, continuous and ambient experiences will emerge to exploit it.

Our lives are becoming increasingly connected to our devices, other people and a variety of things. Smart machines get smarter, and a new IT reality must evolve with technology architectures and platforms to support the advancement of a digitally connected world.
This year’s top 10 strategic technology trends are grouped into these three complementary trends that are mutually reinforcing with amplified disruptive characteristics.

Trend No. 1: The Device Mesh

The device mesh moves beyond the traditional desktop computer and mobile devices (tablets and smartphones) to encompass the full range of endpoints with which humans might interact. As the device mesh evolves, Gartner expects connection models to expand and greater cooperative interaction between devices to emerge. We will see significant development in wearables and in augmented and virtual reality.

Trend No. 2: Ambient User Experience

All of our digital interactions can become synchronized into a continuous and ambient digital experience that preserves our experience across traditional boundaries of devices, time and space. The experience blends physical, virtual and electronic environments, and uses real-time contextual information as the ambient environment changes or as the user moves from one place to another.
Organizations will need to consider their customers’ behavior journeys to shift the design focus from discrete apps to the entire mesh of products and services involved in the user experience.

Trend No. 3: 3D-Printing Materials

We’ll see continued advances in 3D printing with a wide range of materials, including advanced nickel alloys, carbon fiber, glass, conductive ink, electronics, pharmaceuticals and biological materials for practical applications expanding into aerospace, medical, automotive, energy and the military.
Recent advances make it possible to mix multiple materials together with traditional 3D printing in one build. This could be useful for field operations or repairs when a specific tool is required and printed on demand. Biological 3D printing — such as the printing of skin and organs — is progressing from theory to reality; however, politicians and the public don’t have a full understanding of the implications.

Trend No. 4: Information of Everything

Everything surrounding us in the digital mesh is producing, using and communicating virtually unmeasurable amounts of information. Organizations must learn how to identify what information provides strategic value, how to access data from different sources, and how algorithms can leverage Information of Everything to fuel new business designs.

Trend No. 5: Advanced Machine Learning

Advanced machine learning is what makes smart machines appear “intelligent” by enabling them both to understand concepts in their environment and to learn. Through machine learning, a smart machine can change its future behavior. This area is evolving quickly, and organizations must assess how they can apply these technologies to gain competitive advantage.
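The idea of a machine "changing its future behavior" from data can be made concrete with the simplest possible learner. The sketch below is a classic perceptron trained on the logical AND function (my illustrative choice, not something from the trends report): the model's predictions change as its weights are adjusted by the examples it sees.

```python
# Toy perceptron: the machine adjusts its weights from labeled
# examples and thereby changes how it responds in the future.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # weights, one per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for x, target in samples:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - pred
            # Classic perceptron update rule.
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Labeled examples of the AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

print([predict(x) for x, _ in data])  # [0, 0, 0, 1]
```

Before training, the model answers 0 for every input; after seeing the examples, it has learned the AND behavior — a miniature version of the learning loop that powers far larger smart-machine systems.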

Trend No. 6: Autonomous Agents and Things

Advanced machine learning gives rise to a spectrum of smart machine implementations — including robots, autonomous vehicles, virtual personal assistants (VPAs) and smart advisors — that act in an autonomous (or at least semiautonomous) manner. This feeds into the ambient user experience in which an autonomous agent becomes the main user interface. Instead of interacting with menus, forms and buttons on a smartphone, the user speaks to an app, which is really an intelligent agent.
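The shift from menus and buttons to a conversational agent can be sketched in a few lines. The keyword-matching "agent" below is a deliberately naive stand-in for the natural-language understanding a real VPA would use; the intents and replies are invented for illustration:

```python
# Toy conversational front end: the user states a request in
# natural language and the agent dispatches it to an action,
# replacing menus, forms and buttons as the interface.
INTENTS = {
    "weather": lambda: "It is sunny today.",
    "alarm": lambda: "Alarm set for 7:00 am.",
    "pizza": lambda: "Ordering your usual pizza.",
}

def agent(utterance):
    # Naive intent resolution by keyword; a real VPA would use
    # trained language models instead of substring matching.
    for keyword, action in INTENTS.items():
        if keyword in utterance.lower():
            return action()
    return "Sorry, I did not understand that."

print(agent("What's the weather like?"))  # It is sunny today.
print(agent("Please order a pizza"))      # Ordering your usual pizza.
```

The point of the sketch is architectural: the app's surface area collapses into a single `agent()` entry point, with the intelligence deciding which capability the utterance maps to.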

Trend No. 7: Adaptive Security Architecture

The complexities of digital business and the algorithmic economy, combined with an emerging “hacker industry,” significantly increase the threat surface for an organization. IT leaders must focus on detecting and responding to threats, as well as more traditional blocking and other measures to prevent attacks.

Trend No. 8: Advanced System Architecture

The digital mesh and smart machines place intense demands on computing architecture. They’ll get the needed boost from ultra-efficient neuromorphic architectures. Systems built on graphics processing units (GPUs) and field-programmable gate arrays (FPGAs) will function more like human brains, making them particularly suited to the deep learning and other pattern-matching algorithms that smart machines use. FPGA-based architectures will allow distribution with less power into the tiniest Internet of Things (IoT) endpoints, such as homes, cars, wristwatches and even human beings.

Trend No. 9: Mesh App and Service Architecture

The mesh app and service architecture is what enables delivery of apps and services to the flexible and dynamic environment of the digital mesh. This architecture will serve users’ requirements as they vary over time. It brings together the many information sources, devices, apps, services and microservices into a flexible architecture in which apps extend across multiple endpoint devices and can coordinate with one another to produce a continuous digital experience.

Trend No. 10: Internet of Things Architecture and Platforms

IoT platforms exist behind the mesh app and service architecture. The technologies and standards in the IoT platform form a base set of capabilities for communicating, controlling, managing and securing endpoints in the IoT. The platforms aggregate data from endpoints behind the scenes from an architectural and a technology standpoint to make the IoT a reality.