
The Big Gap with Big Data


For the record, I’m only referring to it as ‘big data’ because it makes the title of this article catchier. We all know that I’m of the camp that marketers need to think small about data.

Anyone who has read much of what I’ve written about the concept of big data knows that I firmly believe that, in order to popularize the use of data, we need to think in less grandiose terms. That said, big data earns its name through its big applications. Among those applications are the creative elements spawned from analysis. And therein lies our big gap: generally speaking (in my experience, anyway), creatives tend to be creative, while statistics (and the secret powers they hold) tend to fall on the shoulders of others. Considering the malleable nature and hugely valuable potential of effective data analysis, that divide creates a pretty serious problem.


The Problem (The Big Gap)

So what exactly is this gap? As noted, historically, the creative process has not involved data at this scale. Of course, market research and historical results have always factored heavily into the creation of new creative, but what we’re talking about in this case is something far more focused that can be measured in real time.

With the advent of new media and the facility with which marketers can measure the results of a campaign (and the responses of a targeted audience), data, even superficial data, needs to be a consideration for even the most creatively-driven marketers. And that highlights the problem – our industry’s big data gap – which marketers now face. Data proficiency and insight analysis are no longer just neat skills to highlight on your LinkedIn profile; a successful campaign needs creative minds that can read and adapt to real-time analytics in order to ensure every ounce of potential is extracted from an audience and the subsequent creative.

Alas, there has been a divide between numbers people and creative people since the dawn of marketing. So what is needed in order to start shrinking this gap and blend these two crucial worlds into one?

The Solution

Luckily for us, new media advertising networks (like Facebook, Twitter, and LinkedIn) have been working hard to bring these two worlds closer. Data is now a central focus in advertising dashboards, allowing creatives to analyze their work (from a statistical standpoint), run tests with different variables in play and come to numerically-justifiable conclusions that help improve campaigns (as well as universal branding initiatives) in the long run.

Facebook IQ, a branch of the social networking giant that conducts research, shares data and provides marketers with expert insights, recently published some findings related to closing this ‘creative loop’, as they called it. While the points that these marketers analyzed seem somewhat superficial, or even trivial, the process of testing the minutiae of a campaign is so often overlooked. There were two types of analyses reviewed in this particular publication by Facebook IQ: retrospective analysis and in-market analysis. What exactly is the difference? Retrospective analysis looks at subtleties in past campaigns, while in-market analysis measures performance on a number of these elements in real time. Again, we’re talking about small adjustments (like re-wording a call-to-action, or changing the color of your creative’s background) that can lead to changes in your audience’s response in a controlled environment (keeping the targeted audience constant).
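To make the in-market idea concrete: a test like the ones Facebook IQ describes ultimately compares response rates between two creative variants shown to comparable audiences. Here is a minimal sketch (the variant counts below are invented for illustration) using a two-proportion z-test to judge whether a re-worded call-to-action genuinely changed the click-through rate:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(clicks_a, shown_a, clicks_b, shown_b):
    """Compare click-through rates of two creative variants.

    Returns (z statistic, two-sided p-value)."""
    p_a = clicks_a / shown_a
    p_b = clicks_b / shown_b
    # Pooled rate under the null hypothesis that both variants perform equally
    p_pool = (clicks_a + clicks_b) / (shown_a + shown_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / shown_a + 1 / shown_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical test: original CTA (A) vs re-worded CTA (B), same audience
z, p = two_proportion_z_test(clicks_a=120, shown_a=5000,
                             clicks_b=165, shown_b=5000)
```

A low p-value suggests the difference between the variants is unlikely to be noise; with the invented numbers above, the re-worded variant clears the conventional 0.05 threshold.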

Closing the Loop and Big Data Applications for Creative

Image Credit: Facebook IQ.

The key thing to remember is that not all brands (and not all audiences) are created equal. Much like my feelings about the term ‘big data’, I have made my feelings about the field of aggregate data very clear: blindly following industry averages to craft your strategies – especially with so much of your own data available – is hugely misguided. You need to pay attention to your own creative and your own analytics in order to determine the best course of action, not simply use the call-to-action ‘Read More’ because some study of 1,000 posts found that it performed XX% better. Those studies were impressive at the incipience of social media marketing; now they’re dated and irrelevant.

Customizing Your Gap Closure

As noted, not all brands are created equal. There are tests and research papers that you’ll read and review that might sound intriguing, but you need to remember that your testing grounds might be very different. What’s more, creatives and dataheads will need to approach the bridging of this gap in different ways. First, let’s take a look at creatives.

The creative marketer will need to first become familiar with the analytics dashboards that exist. Look beyond superficial metrics like click-through rate and start studying elements like visitor path and engagement value. Just because your content got lots of views and a high volume of ‘Likes’ doesn’t necessarily mean that you’ve found the Holy Grail. Identify the metrics that matter to your brand and reverse engineer the successful trajectories. Then begin to identify correlations between your most successful pieces of creative in order to run the above-mentioned tests. The opposite is true for the data-inclined.
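The correlation hunt just described can be sketched in plain Python (the post data and attribute names below are entirely invented): rank posts by engagement, split off the top quartile, and compare how common each creative attribute is among the winners versus the rest:

```python
from statistics import median

# Hypothetical post-level export; every field name here is invented
posts = [
    {"engagement": 820,  "has_image": True,  "word_count": 32,  "has_question": True},
    {"engagement": 45,   "has_image": False, "word_count": 95,  "has_question": False},
    {"engagement": 910,  "has_image": True,  "word_count": 28,  "has_question": True},
    {"engagement": 60,   "has_image": False, "word_count": 120, "has_question": False},
    {"engagement": 700,  "has_image": True,  "word_count": 40,  "has_question": False},
    {"engagement": 30,   "has_image": False, "word_count": 88,  "has_question": False},
    {"engagement": 1050, "has_image": True,  "word_count": 25,  "has_question": True},
    {"engagement": 55,   "has_image": False, "word_count": 110, "has_question": False},
]

# Split into the top quartile by engagement and everything else
ranked = sorted(posts, key=lambda p: p["engagement"], reverse=True)
top, rest = ranked[: len(ranked) // 4], ranked[len(ranked) // 4 :]

def share(group, attr):
    """Fraction of posts in the group that have the attribute."""
    return sum(p[attr] for p in group) / len(group)

# How much more common is each attribute among the winners?
image_gap = share(top, "has_image") - share(rest, "has_image")
median_words_top = median(p["word_count"] for p in top)
median_words_rest = median(p["word_count"] for p in rest)
```

With real export data, gaps like these point to the attributes worth feeding into the controlled tests described above.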

If your focus is largely on data, then the reverse-engineered approach described above is already far too familiar to you. Instead, get to know the creative. For numbers people, idiosyncrasies are the name of the game. Become more familiar with the inner workings of your creative in order to better understand what makes your target audience tick. Don’t simply look at the obvious; think about your creative on a more primal level: look at colors and placement and, in one of my favorite emerging fields (from a tech standpoint, anyway), focus on where the audience’s attention and eyes are drawn.

By following these two approaches and learning to empathize with the other side of the equation, the gap can be shrunk significantly. With fewer misunderstandings between these two types of marketers, campaigns can be developed that are more targeted, more adaptive and, ultimately, more successful.

Skype Translator: A New Rosetta Stone


Thanks to deep neural networks, global, seamless communication isn’t far off.

In 196 BC, the Decree of Memphis was issued by Ptolemy V in Egypt. It was written on a granodiorite stele (a big, flat, upright stone) in three scripts: first in Ancient Egyptian hieroglyphs, then in Demotic script and finally in Ancient Greek. It proved to be the key to understanding the hieroglyphs thanks to its three identical texts; we know it as the Rosetta Stone.

Thanks to Microsoft, we’re on our way to a universal translator that will make communicating in several (and, presumably, eventually all) languages seamless. Effectively, Skype Translator is the Rosetta Stone of the technological era.

The Story

I’ve written quite a bit about my amazement at the world of machine and deep learning. This is an example of deep learning in action. Conceptually, a universal translator has been around for decades (long before modern computers came into existence).

The first patents filed for this kind of technology date to the 1930s. And while so many have tried, most have resulted in failure, while the best saw minor successes (if any).

Now, Microsoft (Skype) is aiming to be the first to perfect the concept that had once been a thing of science fiction.

How It Works

I won’t pretend to be a brilliant engineer behind the technology. But I can explain how, in the most simplistic of terms, this works.

For those of you unfamiliar with the concept of machine learning, it is more or less exactly what it sounds like. The complex systems and algorithms that make up an application are somewhat intuitive, adapting to nuances, frequencies and, in the case of natural language processing, things like colloquialisms, to become more accurate. Essentially, the machine evolves as it is exposed to more and more data, helping it become sharper in its results.

In an interview with Time magazine, Lane Schwartz, a linguistics professor at the University of Illinois, explained, “The more data you have, the better you’re going to do.” It’s that simple.

What Microsoft has done (according to its engineers and product teams) is perfect the first form of the universal translator, one that will use what it learns in beta testing to become smarter, more intuitive and faster.

Ultimately, when one person speaks in their preferred language to another person with a different language setting, the system will (almost instantaneously) register what was said and (with grammatical accuracy) regurgitate the phrase in the other user’s language.
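In the most simplistic of terms, that round trip chains three stages: speech recognition, machine translation and speech synthesis. The sketch below is purely illustrative; each stage is a toy stand-in (a tiny phrase table instead of a neural model), not Skype Translator’s actual components:

```python
# Toy stand-ins for each pipeline stage; real systems use deep neural
# networks for all three.
def recognize_speech(audio: bytes, lang: str) -> str:
    # Stand-in: pretend the audio decodes directly to text.
    return audio.decode("utf-8")

def translate_text(text: str, source: str, target: str) -> str:
    # Stand-in: a tiny hand-written phrase table instead of a neural model.
    phrase_table = {("en", "es"): {"hello": "hola", "world": "mundo"}}
    table = phrase_table.get((source, target), {})
    return " ".join(table.get(word, word) for word in text.lower().split())

def synthesize_speech(text: str, lang: str) -> bytes:
    # Stand-in: encode text instead of generating actual audio.
    return text.encode("utf-8")

def universal_translate(audio: bytes, source: str, target: str) -> bytes:
    """Chain the three stages: speech -> text -> text -> speech."""
    text = recognize_speech(audio, source)
    translated = translate_text(text, source, target)
    return synthesize_speech(translated, target)

out = universal_translate(b"hello world", "en", "es")
```

The hard part, of course, is that each stand-in above is an entire research field in its own right.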

Pretty impressive stuff.

The Applications

If this works as well as demonstrations have shown and Microsoft has promised, it will (quite literally) unite the world on one front.

We often hear the jokes about fairly trivial apps ‘changing the world’. Well, this actually would. On the business front, this would be a first step in the convergence of several currently detached economies. I say first step because, while communication and those items lost in translation matter, there are still major social hurdles to overcome (in the way different societies do business).

The ability to provide aid and assistance from a distance or during an emergency is also greatly facilitated with this kind of an application. Imagine someone calls you frantic in a language you don’t understand, but your universal translator is capable of translating it so that you can understand it instantaneously.

And those are just a few of the amazing things that can be done.

It’s exciting to think about how this will make the world smaller (yet again).


6 Tech Trends for Which to be Thankful


As we approach Thanksgiving, here are a few tech trends for which all members of the digital space should be thankful this year.

In the world of tech, things move at the speed of ideas. (I think I’ll start using that more often.) So, every year, there are a lot of new developments that shake up and excite the market. Here are a few tech trends that we can all be thankful for this year.

Internet of Things

If you visit my blog often enough, then you’ll know that I am completely enamoured by the Internet of Things. In fact, I recently wrote an article about why the Internet of Things is the greatest concept ever. Well, it seems as though I am not alone in this thinking.

LittleBits Internet of Things Tech Trends

Recently, there has been a wave of companies introducing IoT starter kits. This comes after the wave of companies (perhaps the most well-known of which was Nest) introducing IoT products. These starter kits allow your average household item to become a smart item. Companies like IBM, TinkerForge and LittleBits are all breaking into the space, and it shouldn’t be long before we see the dream of a smart house made readily available to people from all walks of life.


Wearables

So, Google Glass might not be the cyborg-esque wearable we all hoped it would be (yet), but that hasn’t stopped the world of wearables from really taking off and essentially creating a brand new, high-demand marketplace.

The market for wearables is on the rise and there doesn’t seem to be a ceiling with regards to where it can go. Theoretically, everything can eventually be smart. Now we have watches, glasses, bracelets and accessories, and there is certainly a trend indicating that this is just the beginning. I don’t think I’m alone in hoping that one day there will be a thermometer in my clothes that automatically warms them up for me on a cold winter’s morning. #BillionDollarIdea

Improved AR

While I didn’t mention this trend in my recently published article about trends I noticed at ad:tech in New York, it is definitely something worth getting excited about.

AR (augmented reality) has long fascinated us. When it was first introduced on camera phones (providing us with detailed information about the locations toward which we point our phones’ cameras), it seemed like something out of the future. Now that future is here and AR is not only becoming more commonplace, but more malleable, particularly as technologies like graphene become more widely used.

Smart Accessories (Not Wearables)

We have wearables, so why would we need to include smart accessories? Doesn’t that fall into the category of wearables? Not exactly. These accessories are a little larger.

Trunkster Smart Luggage Tech Trends

The one example that comes to mind is luggage. It might not sound like a field ripe for disruption, but there are groups out there trying to do just that. One such example is Trunkster. This awesome Kickstarter campaign touts a piece of smart luggage that not only goes zipperless, but also features a USB charger for your electronics and, perhaps most impressive, a GPS tracker that links to your phone so you always know where your luggage is (even if the airline doesn’t).


Beacons

Beacons were first introduced before their time (if that makes sense). People were not ready for personalized, proximity-based marketing initiatives, but today they are.

Beacons are a fairly simple bit of technology: essentially, they are designed to identify devices and, for example, push notifications to those devices based on proximity. So, if I walk into a store where I have never registered my device, a beacon might register my new device and send me a notification offering a discount on a first purchase. Slowly but surely, beacons are making their way into the market as the tech world and the physical market collide in grand, new ways.
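That flow can be sketched roughly as follows; the signal-strength threshold, store name and notification wording are all invented for illustration (real deployments use Bluetooth Low Energy protocols like Apple’s iBeacon, where proximity is inferred from received signal strength, or RSSI):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Beacon:
    """A store beacon that notices nearby devices by signal strength."""
    store_name: str
    rssi_threshold: int = -70          # dBm; closer devices report higher RSSI
    seen_devices: set = field(default_factory=set)

    def on_advertisement(self, device_id: str, rssi: int) -> Optional[str]:
        """Return a notification for new, close-enough devices; else None."""
        if rssi < self.rssi_threshold:
            return None                # too far away to bother
        if device_id in self.seen_devices:
            return None                # already greeted this device
        self.seen_devices.add(device_id)
        return f"Welcome to {self.store_name}! Here's 10% off your first purchase."

beacon = Beacon("Acme Outfitters")
msg = beacon.on_advertisement("device-123", rssi=-55)     # close and new
repeat = beacon.on_advertisement("device-123", rssi=-50)  # repeat visitor
far = beacon.on_advertisement("device-456", rssi=-90)     # too far away
```

The interesting product decisions live in those two `None` branches: how close is close enough, and how often is a greeting welcome rather than creepy.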

Natural Language Processing

Anyone who knows me knows that I generally base my critiques of social listening and analysis software on their natural language processing capabilities. Well, it seems to be an important criterion for the market, because it is something that is getting a lot of attention.

For marketers, natural language processing can be particularly difficult to deal with, especially when it comes to things like social listening. People tend to adopt very colloquial speech when sharing content on social media (of course). For marketers, this can pose a bit of a problem when trying to determine industry or target audience pain points or preferences. Luckily, plenty of software providers are starting to make big strides in the way of natural language processing, and I can’t wait to see where that takes the industry.
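To see why colloquial speech trips up naive keyword matching, consider this toy sketch (the slang map is invented; real NLP systems learn such mappings from large corpora rather than hand-written tables):

```python
import re

# Toy slang/abbreviation map; real systems learn these from data
SLANG = {"gr8": "great", "luv": "love", "u": "you", "thx": "thanks", "b4": "before"}

def normalize(post: str) -> str:
    """Lowercase, strip punctuation, and expand common slang."""
    tokens = re.findall(r"[a-z0-9']+", post.lower())
    return " ".join(SLANG.get(t, t) for t in tokens)

raw = "Gr8 service, luv this brand!!! thx"
clean = normalize(raw)
# A naive search for 'great' misses the raw post but hits the normalized one
```

A social listening tool that only matches dictionary words would file the raw post as neutral noise; after normalization, the positive sentiment is plain to see.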


At this time of year, we give thanks for a lot of different things, and for marketers (and plenty of consumers) these are just a few of the exciting tech trends we can be thankful for.

Now, I would like to wish everyone a very happy Thanksgiving, and lots of luck with Black Friday!

Why We Need to Think Small About Big Data


Big Data is huge (literally and in terms of its popularity) but to really see its value, we need to think about it at the micro level.

In January, two Princeton University students decided to apply virus propagation modelling to Facebook’s growth to project its future. Based on their results – and research methods that relied on large amounts of historical data about the rates at which diseases spread and decline within society – they concluded that Facebook’s membership would decrease by 80% by 2017 (maybe because of some Facebook vaccine?). Now, we should keep in mind that this study was not peer-reviewed and was done more as a way to shock the world than to relay a point. What wasn’t accounted for was the fact that Facebook has reached a critical mass. Long story short: Facebook isn’t going anywhere.

The Facebook Virus Model

What we are supposed to expect from Facebook in the next few years based on Virus Propagation Modelling.

A few years ago, during the incipience of the concept of ‘big data’, Google decided to use the data they had from search histories to predict when flu season would arrive. (Why all this focus on sickness and disease when it comes to data?) Though the initial experiment was a success, Google has seen more missteps than triumphs in recent years when it comes to predicting the next plague.

Big data is exciting (or sickening, depending on how you interpret its aforementioned applications). Recently, a phenomenal op-ed by Gary Marcus and Ernest Davis (which I would implore everyone to read) was published in the New York Times. The article focused on major issues plaguing the application of big data to social modelling (much like the issues described above). Right now, there is so much hype around the concept of big data that people are forgetting that it needs to be looked at on a small scale. Leave the mass modelling to governments and think tanks. Businesses should be looking at their own internal version of ‘big data’ (relatively speaking) and seeing how it can aid in achieving operational efficiency.

What exactly is ‘Big Data’?

Big data is, for the most part, exactly what it sounds like; it is large quantities of data with virtually limitless applications. Data sets collected are often (and by definition) so large that traditional methods of computing become virtually impossible, hence the advent of so many analysis tools in the marketplace today. In the last two years, we (humans) have created 90% of the information that exists in the world today (statistic courtesy of IBM). Every day, we are creating more, and it is both overwhelming and exciting.

Think Small About Big Data

Photo Credit: Shutterstock. Used under license.

But as with many new applications, we are more excited by the prospect of what it can do on a large scale (and rightfully so) than how it can be applied to smaller applications. The irony is that in this case, at least at present, the real value in big data is thinking small.

Just how small are we talking?

Well, Google was trying to leverage big data in order to predict sickness in different geographical locations. That’s a big use of big data. A smaller use would be finding correlations in engagement spikes and optimizing a content strategy with the information you can pull from these reports.
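Finding those engagement spikes is itself a small, tractable computation. As a sketch (the daily figures below are invented), flag any day that sits well above a trailing baseline:

```python
from statistics import mean, stdev

def find_spikes(series, window=7, z_cutoff=2.0):
    """Return indices whose value exceeds the trailing window's mean
    by more than z_cutoff trailing standard deviations."""
    spikes = []
    for i in range(window, len(series)):
        trailing = series[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma > 0 and (series[i] - mu) / sigma > z_cutoff:
            spikes.append(i)
    return spikes

# Hypothetical daily engagement counts; day 10 is an obvious spike
daily = [100, 95, 110, 105, 98, 102, 99, 101, 97, 103, 400, 104, 100, 99]
spike_days = find_spikes(daily)
```

Once the spike days are flagged, you can go back and ask what the content posted on those days had in common.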

Every day, businesses are handed a new set of big data from every one of their digital components. Whether it is your Facebook page, a Twitter ad campaign or your website’s analytics, there is a new trove of data waiting to be combed for valuable gems.

What in the world am I talking about, you ask?

This is what in the world I am talking about.

Say I export my data from Facebook once every week. At the end of the month, I have thousands of data points that can be analyzed. Don’t believe me? Log in to your Facebook page and export your most recent month’s page level data. Every day there are dozens of data points that you can study. Even more so when you look at your post level data, and you can double that if you run ads on Facebook. So when I say thousands, I mean thousands.

Say you run a regression (sorry, stats-haters) on your data in order to identify outliers (both positive and negative) as well as influence points. Then, looking closely at those outliers and influence points, you can identify the correlations that exist. Maybe at every one of these points, you notice that an image was shared as opposed to text. Maybe 95% of these posts have fewer than 40 words. Maybe more than three-quarters feature a link or a question. This is all extremely valuable when optimizing your brand’s content strategy.
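The regression step doesn’t require heavy tooling. A minimal sketch (with invented numbers): fit a least-squares line of engagement against one attribute, then flag posts whose residuals are unusually large as the outliers worth a closer look:

```python
from statistics import mean, stdev

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    x_bar, y_bar = mean(xs), mean(ys)
    b = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
    a = y_bar - b * x_bar
    return a, b

def flag_outliers(xs, ys, z_cutoff=2.0):
    """Indices whose residual exceeds z_cutoff residual std devs."""
    a, b = fit_line(xs, ys)
    residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
    sigma = stdev(residuals)
    return [i for i, r in enumerate(residuals) if abs(r) > z_cutoff * sigma]

# Hypothetical posts: word count vs engagement; the last post massively
# over-performs for its length and should be flagged
word_counts = [20, 35, 50, 65, 80, 95, 110, 30]
engagement  = [520, 470, 430, 380, 330, 290, 240, 900]
flagged = flag_outliers(word_counts, engagement)
```

The flagged posts are exactly the ones to inspect by hand: what did that over-performer do differently (an image, a question, a link)?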

Why am I emphasizing your brand? Too often, we get bogged down in aggregate data. What’s that? It’s people who look at five million Facebook posts, note that images have higher engagement and tell brands that they need to share images and nothing but images. That might have been true for the majority of posts that were examined by that one analyst, but it might not work with your audience. Using your own version of big data to run these same (fairly simple) reports can add a huge competitive advantage when it comes to sharing content that resonates with your audience.

What if I don’t understand all of this?

The beauty of these kinds of programs is that they have been designed to appeal to the masses. It would be foolish for a company like Facebook or Google to nix the idea of targeting the statistically-inept. After all, that would make up the majority of us. Basic analyses of your data are everywhere. Tools exist to examine and analyze top-level data and even some of your free insights on those same networks provide you with information that can be extremely valuable in crafting these strategies.


You don’t need to be a statistician to use big data. When you see the potential with regards to the applications of big data, you can understand why there is such excitement surrounding it. Imagine the possibilities that can be unearthed by looking at these data in new ways.

At the end of their piece, Marcus and Davis conclude that there is reason to be excited about big data, but that the 19th and 20th centuries saw more marked breakthroughs in the form of medicine and other major societal advancements. Of course we can’t dispute that those discoveries changed everything about human life. But big data is just getting started. We have no idea where it can take the human race (both metaphorically and physically) and what the future holds.

All we can say for certain is that when it comes to business, the applications are clear and extremely useful.