Category Archives: Research

Transport Challenge: Deutsche Bahn and Data Pitch ask startups to change the future of transport

German rail company Deutsche Bahn AG is making its data available to startups and SMEs as part of Data Pitch, a new European Commission-funded initiative supporting open innovation with data. The deadline for entrants is October 1st.

Should Innovators Reveal How Much They Let Technology Make Creative Choices?

Is it true? Do the most creative people generate ideas straight out of their heads, without any outside help? That's what most people would tell you. But in reality, the best innovators boost their creative output with the help of structured tools such as patterns and even technology.

David Pogue wrote a brilliant article in Scientific American titled "Should Artists Reveal How Much They Let Technology Make Creative Choices?" He cites numerous examples of how artists and entertainers use various types of aids to create their masterpieces. From the article:

“Apple's GarageBand program for Mac computers lets you create fully orchestrated “compositions” just by dragging tiles into a grid. Everything sounds great, whether or not you know anything about rhythm, pitch or harmony. At the time of GarageBand's introduction, its product manager told me that even if the program semiautomates the composition process, it still gives people a taste of the real thing. It could inspire a novice to learn music, maybe take up an instrument.

Agreed. But how can we gauge artists' talent without knowing how much of the work was theirs? Should it affect how much we pay for their output? And what about when commercial musicians use GarageBand to produce their tracks—as Oasis and many indie bands have done?

Everyone knows that technology assists almost every creative endeavor these days, from the moment a four-year-old drips paint onto a turntable to make spin art. We also are aware that Hollywood uses computers for its special effects and that most pop songs are Auto-Tuned and pitch-corrected. But in those cases, the audience is in on the fact that machinery has helped out.

It's not the same thing when technology's assistance is concealed from us and is credited to the human. That's why lip-synching at live concerts is still controversial and why athletes are disqualified for secretly using drugs or other performance enhancements. Disclosing when our creative works have come from canned parts isn't just important for intellectual honesty; it would also make a better barometer for the rising tide of robots entering creative fields. (If you hadn't heard, robots are now capable of composing chorales and painting portraits.)

These days even professional musicians, artists and performers can substitute an on/off switch for human talent. Shouldn't the public know which is which?”

David's point about whether the public should know is well-taken. But in the grand scheme of things, what matters most is how humans can elevate their creative output. Extensive research has shown that structured approaches do more to boost creative output than to limit it. For thousands of years, inventors have embedded five simple patterns into their inventions, usually without knowing it. These patterns are the "DNA" of products: they can be extracted and applied to any product or service to create new-to-the-world innovations. Using these patterns is no different than using a human-engineered technology. The technology carries within it the wisdom of its creator, which is then transferred to others to boost their creativity.

Humans have evolved to create. Stepping on the shoulders of others, be it through a technology or a pattern, is our next evolutionary path.



More Than A Dream: Advanced Technology And Creating A Risk-Free Market

By Carol Ozemhoya, Contributing Editor at Vector

Some people worry about technology costing people jobs and taking over the world, as portrayed in many major motion pictures such as the “Matrix” series.

But in reality, many advances in technology have made our lives easier, safer and, well, cheaper. Consider the cost of color TVs when they first came out versus what they cost now… a mere fraction of the earlier models. And the same goes for other electronics, including computers and smartphones.

And that’s not even mentioning the fact that those very same electronic devices are processing information faster than ever.

There’s also advanced technology’s impact on financial markets. Artificial intelligence, for example, can enable a trader, financial analyst or even an ordinary person to predict the volatility of the stock markets and even specific stocks. That comes with advanced technology systems that are designed to monitor social media and news sources.

One such system is called Vector… it tracks social media and news coverage of specific trends, companies, stocks and even personalities, returning information that can assist its user in making crucial financial moves.

So, the question becomes… could some of these advancements in processing power drive a “Zero Volatility Point” in financial markets?

Some say yes, others say not so much

First of all, some of the rhetoric on the matter comes from what some refer to as the end of Moore’s Law. This is in reference to the chip in most, if not all, computer processing systems. The big question seems to be… how much longer can the developers go before the chip reaches its limits in terms of size and speed?

Sophie Wilson, designer of the original Acorn Micro-Computer in the 1970s and later developer of the instruction set for ARM's low-power processors that have come to dominate the mobile device world, has such thoughts, reports Next Platform.

“And when Wilson talks about processors and the processor industry, people listen. Wilson’s message is essentially that Moore’s Law, which has been the driving force behind chip development in particular and the computer industry as a whole for five decades, has hit its limits, and that despite the best efforts by chip designers around the world, the staggering gains in processor performance over that time will not be seen again.”

She goes on to say that since Intel co-founder Gordon Moore introduced his prediction in 1965 that the number of transistors in a processor would double every two years – later amended to every 18 to 24 months – the IT industry has been on a relentless march to fulfill that prophecy, with significant success. In her lifetime, she said, the performance of computers has increased by a factor of 10,000, due in large part to the continual shrinking of transistors on chips.
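As a back-of-the-envelope illustration (ours, not from the article), the doubling arithmetic behind Moore's Law and Wilson's 10,000x figure can be sketched in a few lines:

```python
import math

def doublings(years, period_years=2):
    """How many doublings occur over a span, given the doubling period."""
    return years / period_years

# Doubling every 2 years for 50 years gives 2**25 growth in transistor count.
growth = 2 ** doublings(50)
print(f"{growth:,.0f}x")  # 33,554,432x

# Wilson's cited 10,000x performance gain corresponds to roughly
# log2(10,000) ~ 13.3 doublings.
print(round(math.log2(10_000), 1))  # 13.3
```

Note that the 10,000x figure is smaller than raw transistor growth would suggest: performance gains lagged transistor counts even before the doubling cadence itself began to slip.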

Chip designers can't keep this up, some believe, because the industry found out about 10 years ago that it couldn't keep increasing performance through faster clock speeds: the processors get too hot. In fact, Intel's Pentium Pro neared the point of being as hot as a hot plate. Ouch!

That didn’t really stop chip developers, as they sought out other methods of enhancing performance, such as adding multiple processing cores. That worked for a while, but even adding processing cores is reaching its limits, reports Next Platform.

Here's the thing: at some point, how fast those financial gurus can receive and process financial reports, world news (such as elections, which can heavily impact financial markets everywhere) and plain old tips and instinct will level off, and that could mean a volatile market.

But the News Isn’t All That Bad

Advancements in artificial intelligence (AI) and alternative approaches such as quantum computing, protein computing, DNA computing (data storage), logic gates and nanomachines may provide money managers and researchers with new tools not only to read and interpret data at phenomenal rates… but to render precise predictions across all asset classes at once.

“Quantum computers use qubits to store 0’s and 1’s that are encoded in two distinguishable quantum states, and process them simultaneously,” explains Jo Fletcher, co-founder and chief marketing officer of Vector. “As a result of the aforementioned superposition, quantum computing may facilitate deep learning and data processing capabilities that may far surpass today's standards.”
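To make Fletcher's description concrete, here is a toy sketch (our illustration, not Vector's technology) of a single qubit: its state is a unit vector of two amplitudes over the basis states |0> and |1>, and a Hadamard gate puts it into an equal superposition of both:

```python
import numpy as np

zero = np.array([1.0, 0.0])                   # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

superposed = H @ zero            # equal superposition of |0> and |1>
probs = np.abs(superposed) ** 2  # Born rule: measurement probabilities
print(probs)                     # [0.5 0.5]
```

The "processing them simultaneously" claim refers to the fact that a register of n qubits holds amplitudes for all 2^n basis states at once, which is what classical simulations like this one cannot scale to.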

Sure, having a savvy business head, experience and even instinct will remain important, but a lot of the guesswork could be cut out, as more and more accurate information is processed and better predictions are created through modes such as quantum computing.

“Imagine a world where analysts are no longer responsible for performing analysis on one sector or asset class at a time. But instead, they cover all asset classes at once because of the ability to process all known financial information at hyper speed,” says Anton Gordon, co-founder of Vector.

Now factor in artificial intelligence, and there is a distinct possibility of a zero-volatility point. This would be a point where all known and previously unknown financial market data could be factored into short-term and intermediate-term trades.

What Does It All Mean?

Vector’s Anton Gordon explains it like this: “The very nature of how we trade stocks and other assets will change. As the use and application of quantum computing normalizes in the financial services industry, we may reach a ‘zero-volatility point.’ This is a point in which the returns on various asset classes approach the risk-free rate due to market prices that more accurately reflect current future expected information on the underlying assets.” 
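One toy way to picture Gordon's claim (our assumption, not Vector's or Gordon's model) is to treat an asset's expected return as the risk-free rate plus a premium that compensates for remaining uncertainty. If better information drives that uncertainty toward zero, the premium vanishes and the expected return converges to the risk-free rate:

```python
RISK_FREE = 0.03     # hypothetical risk-free rate
FULL_PREMIUM = 0.05  # hypothetical premium at full uncertainty

def expected_return(uncertainty):
    """Expected return as residual uncertainty (1.0 down to 0.0) shrinks."""
    return RISK_FREE + FULL_PREMIUM * uncertainty

for u in (1.0, 0.5, 0.1, 0.0):
    print(u, round(expected_return(u), 3))
# as uncertainty -> 0, expected return -> 0.03, the risk-free rate
```

The numbers here are placeholders; the point is only the limiting behavior as uncertainty goes to zero.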

In other words, in the future, quantum computing could allow for perfect market information. The possibilities are, quite frankly, staggering.

About Vector
Vector is a natural language processing application that performs information extraction on millions of news stories per day. It provides high value to any quantitative researcher, adding a collaborative-authoring workflow in perfect synergy with the most powerful and unique faceted search in the business. For more information, please visit or

Useful Links
Vector website:
Press kit:
Social Media: @indexervector, LinkedIn@Indexervector, @indexerme, @jofletcher

About Indexer
Indexer is a tech start-up in the artificial intelligence space and has a focus on computer vision and natural language processing technologies.