

In an awards ceremony held during the Teradata PARTNERS Conference, Teradata honored 33 high-performing customers and partnerships with Teradata EPIC Awards for their innovative use of data and analytics in shaping positive business outcomes.


See who walked the Orange carpet...


Teradata EPIC Awards Finalists


To all 2017 EPIC winners and finalists, spread the word of your big win! Get your brag on here.





Chris Rashilla-Barton is an experienced marketing professional with expertise in global program management, from development and planning through implementation, employee engagement, and results analysis. Accomplished in developing and managing teaming relationships with internal clients, partners, agencies, and vendors. Strategic and creative thinker with effective communications and writing skills.

From self-driving cars to photo recognition, AI is becoming an increasing presence not just in our headlines, but also in our lives. Yet depending on the business problem and data involved, there are challenges to applying AI in an enterprise context.

Enterprise AI is a different game with different rules. Some of the differences, which I’ll cover below, are based on both the kinds of data available in the enterprise and the complexity of the operations in which AI will be used. Here are three implications to consider for using AI in an enterprise context.


Implication 1: Consider domains where AI has been proven  

AI has had some spectacular successes across a broad range of domains: image recognition, object detection, diagnostic image analysis, autonomous driving, machine translation, sentiment analysis, speech recognition, robotics control, and, of course, Go and chess. Notably, all of these breakthroughs are in domains that humans are quite good at.  This makes sense: deep learning networks are inspired by the architecture of the human brain and, in the case of computer vision, by specific structures within the visual cortex. All of these examples represent problems with a hierarchical structure that is amenable to increasingly abstract representation and understanding of the domain. These domains are also associated with extensive publicly available research, code, and, in many cases, pre-trained models.

On the other hand, the application of AI to domains outside of those listed above is less well developed. Think of recommender systems, fraud detection, or preventative maintenance models. AI has been applied successfully to each of these domains, but the results are more incremental and the research is much less publicly available. This partly reflects the fact that these domains involve closely guarded enterprise data that cannot readily be shared with the broader community, and partly the nature of the data itself.

Now, the good news is that many enterprises have problems that involve vision, language, or robotics control. Whether it’s computer vision on the factory floor or in inventory management systems, or natural language processing (NLP) for compliance reporting or sentiment analysis, companies can directly leverage an enormous body of research and experience. For other domains, those lacking established research, pre-trained models, published papers, or notable public success stories, AI should be viewed as part of a continuum with other machine learning and analytical techniques.

Implication 2: AI isn’t magic

Viewing AI as an extension of traditional analytics and machine learning for domains with unproven track records will help organizations avoid ascribing a kind of magic to AI: just feed in enough data and you will get good results. If you have this kind of magical thinking about AI, then drop a rubber duck in a stream and try to get AI to predict where it’s going to end up. You can train that model for the next thousand years and you won’t get good results. Without modelling the individual molecules that make up the stream, the process is fundamentally stochastic; there is nothing AI can do.   
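A toy simulation makes the rubber-duck point concrete. The drifting duck behaves like a random walk: identical starting conditions produce wildly different endpoints, so no amount of training data lets a model beat simply predicting the average. The code below is an illustrative sketch, not a real hydrodynamic model.

```python
import random
import statistics

random.seed(0)

def duck_drop(steps: int = 200) -> float:
    """Simulate a duck drifting downstream: a random walk of turbulent nudges."""
    x = 0.0
    for _ in range(steps):
        x += random.gauss(0, 1)  # each eddy nudges the duck unpredictably
    return x

# Gather "training data": the same starting state, many different outcomes.
outcomes = [duck_drop() for _ in range(500)]

# The best any model can do from the starting state alone is predict the
# mean; the spread around it is irreducible noise, not model error that
# more data or a deeper network could remove.
best_guess = statistics.mean(outcomes)
spread = statistics.stdev(outcomes)
print(f"best possible prediction: {best_guess:.1f} +/- {spread:.1f}")
```

However long you train, the residual spread stays at roughly the square root of the number of steps; the uncertainty is a property of the process, not of the model.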

AI isn’t a blanket solution for all of the problems enterprises want to use it for. Just because you’re able to classify images doesn’t mean you’re going to be able to perfectly forecast the amount of soda consumed in the Northwestern US in November.

In assessing the best use cases for AI in your business, look closely at the problem spaces you have available. Do you have any problems in areas for which there is available research?  Do you have problems where you have already been applying machine learning?  These are good candidates for applying AI.

You have probably also heard that AI is very data hungry. The AI breakthroughs discussed above involved truly massive data sets: millions of images in the case of computer vision models. It’s impossible to predict exactly how much data you’ll need to make AI successful, but typically the smaller the data set, the more likely you are to be better served with more traditional analytical techniques.

Similarly, you have probably also heard that AI and deep learning cut down on the need for manual feature engineering. This is certainly true in the breakthrough domains: computer vision models just look at pixels, NLP models just look at words (or sometimes just characters). The case with enterprise data is less clear. The data certainly needs to be clean and integrated, categorical features need to be encoded, and time series data needs to be dealt with (sometimes requiring manual feature engineering). Overall, take the feature engineering claims of AI with a grain of salt for unproven domains and expect to have fairly complex data pipelines.
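To illustrate what that residual feature engineering looks like in practice, here is a minimal sketch with made-up daily sales numbers: a categorical channel feature is one-hot encoded, and lag features are built by hand from the time series, exactly the kind of manual preparation that enterprise pipelines still need before any model sees the data.

```python
# Toy daily-sales series with a categorical sales-channel feature
# (illustrative numbers only).
sales = [120, 135, 128, 150, 142, 160, 155, 170]
channel = ["web", "store", "web", "web", "store", "web", "store", "web"]

# One-hot encode the categorical feature...
categories = sorted(set(channel))  # ["store", "web"]
def one_hot(value):
    return [1 if value == c else 0 for c in categories]

# ...and build lag features by hand: the previous n_lags days of sales
# become inputs for predicting the current day.
def make_rows(series, cats, n_lags=2):
    rows = []
    for t in range(n_lags, len(series)):
        lags = series[t - n_lags:t]              # manual feature engineering
        rows.append((lags + one_hot(cats[t]), series[t]))
    return rows

rows = make_rows(sales, channel)
print(rows[0])  # ([120, 135, 0, 1], 128): features and target for day 3
```

None of this is exotic, but it has to happen somewhere, whether the downstream model is a regression, gradient-boosted trees, or a neural network.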

Implication 3: Try multiple AI experiments to find quick wins

Given the number of unknowns involved with assessing whether AI is right for your use cases or not, it’s better to start by casting a wide net. Apply AI to a larger number of problems — ten, for instance — and see which ones produce the best results. Such an approach means you’re not beating your head against the wall with one problem, forcing a square peg into a round hole, when AI isn’t the right approach for that particular issue. Additionally, with this strategy, you can ensure that you get some results relatively quickly. You’ll then have the patience to create the right AI-based models for use cases that might take longer (and the latitude to rule out AI where it isn’t the right solution to the problem at hand).  
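The wide-net strategy can be as simple as ranking pilot experiments by their lift over an existing baseline. The sketch below uses hypothetical problem names and accuracy numbers, purely to show the triage logic: pursue the clear wins first, and treat pilots where AI underperforms the baseline as candidates to rule out.

```python
# Hypothetical pilot results: problem -> (baseline accuracy, AI accuracy).
# All names and numbers are illustrative, not real benchmarks.
pilots = {
    "invoice image classification": (0.72, 0.91),
    "churn prediction":             (0.81, 0.83),
    "demand forecasting":           (0.64, 0.62),
    "support-ticket routing":       (0.70, 0.88),
}

# Rank experiments by lift over the baseline and keep the clear wins;
# revisit or drop the rest rather than forcing a square peg into a round hole.
ranked = sorted(pilots.items(), key=lambda kv: kv[1][1] - kv[1][0],
                reverse=True)
wins = [name for name, (base, ai) in ranked if ai - base >= 0.05]
print("pursue first:", wins)
```

Here the image and ticket-routing pilots show large lifts and become the quick wins, while the forecasting pilot, where AI trails the baseline, is the one to set aside.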

Implementing AI involves many considerations. Many companies have never put machine learning models into production, and so jumping into the deep end with AI will mean they’ll soon find themselves in over their heads. It’s not that AI is more difficult than other machine learning, but deploying, monitoring, versioning, and tracking the performance of models is complicated, and if companies do not have experience with it, their AI implementation will not be smooth. A deliberate approach, where AI is applied gradually to a number of use cases, is a way to improve the transition.


Ben MacKenzie, based in Ottawa, Canada, has been a Principal Architect with Think Big Analytics, a Teradata company, for the last six years, and has served as the Global Engineering Lead for the last two. Ben has been focused on building scalable, open-source-based analytics solutions across a variety of industries, and in his capacity as Engineering Lead has helped align Think Big and customers around a complex technology landscape. Ben has an extensive background in AI and is excited to be part of the current deep-learning-inspired AI renaissance in his new role as Director of AI Engineering. In addition to strong engineering and analytical skills, Ben has a proven track record of employing cutting-edge research from the deep learning community to build customer solutions.

I recently read that to manage customer journeys, those journeys must be defined and accessible. Doesn’t this sound simple and awesome? The problem is that it does not reflect the reality in which most large organizations operate.

Journey management and journey analytics remind me of the chicken and the egg – which comes first? There are journeys you inherently need to manage – onboarding, upgrades, and complaint resolution, for example – and analytics are necessary to monitor and improve those. But there are also important customer journeys and paths that you only discover through analytics and exploration. Are transitions between specific support channels proving to be critical, for example? Or do I just ignore this question if the journey was not previously defined for me?

Hopefully, these few examples demonstrate that journey management and analytics must go hand in hand. One of the biggest challenges to date in enabling this has been getting your data into one place. In some cases, portions of data from a single channel might live in multiple stores. In other cases, data from a single channel may live in isolation in a single repository. And in some cases, bits and pieces of data from multiple channels are transformed or duplicated across multiple stores to the point where their lineage is almost impossible to ascertain.

Thankfully, Teradata has recognized this challenge as an opportunity for innovation.

With recent announcements around the Teradata Analytics Platform and Teradata Everywhere, companies have access to a multitude of analytics tools that can be deployed and used to analyze data in a variety of data stores, on-premises or in the most popular enterprise clouds.

Ultimately, organizations still must agree on which journeys they should actively manage. But the hurdle of getting data into a single repository is no longer there. And while enterprises can now manage and analyze critical journeys by dissecting data from a variety of channels and stores, they have the freedom and flexibility to explore new scenarios and new channels. As most journey pros will tell you, the job of managing customer journeys is continuous. There are always new interactions to investigate and new channels through which to operate.

While the job is never done, at least some critical limitations are being lifted. With Teradata, you can manage the journeys you understand, while continuing to evolve those journeys and define new journeys at the same time.

Are you interested in exploring your customer journeys – defined or not? If so, check out our Path Analysis Guided Analytics Interface, which runs on Teradata, Aster, and the Teradata Analytics Platform.

Ryan Garrett is senior business development manager for Think Big Analytics, a Teradata company. His goal is to help organizations derive value from data by making advanced analytics more accessible, repeatable and consumable. He has a decade of experience in big data at companies large and small, an MBA from Boston University and a bachelor’s degree in journalism from the University of Kentucky.
