Most people in our game cringe when you mention "affiliate marketing". But despite its bad rap, I've had only positive experiences to date.

The first: a couple of content websites I built and eventually sold. While the majority of revenue was generated through advertising, affiliate links to merchandise generated a passive, ongoing source of income. Even with relatively big traffic numbers the income was small, but it was incremental and not bad for a student who had no idea what he was doing.

The second was directly influenced by the first. After selling one of the websites, I was discussing with a Creative how the site made money. Less than a month later, he came up with Homepage For The Homeless, a simple but brilliant idea to turn your everyday online shopping into a charity donation.

Since then I've always wanted to explore further - especially understanding whether communities can be built and leveraged while maintaining transparency, and whether affiliate marketing is sustainable on higher-priced items. With that in mind, this post is about to become my third experience with affiliate marketing. That's why I signed up to FlexOffers, an affiliate service that combines a number of networks into one program, making it easy for newbies like me. And yes, that link is a referral one. And they sponsored this post too. So here's the question - does paying the bills hurt my credibility, even with transparency?...

In 2013 I published a list of ideas I wasn't going to do anything with. And did it again in 2014. I believe the expiry date on an idea, from 'light bulb moment' to action, is about a year. If you haven't actioned something after 12 months, you should open-source it - put it out in the world for someone else to do something with.

But it's not true for just the ones sitting there collecting dust - people hoard ideas even as they work on them. Sharing exposes them to theft, they think. Or, worse, criticism. I have never experienced this to be the case - in fact, every time I've shared an idea it's been made better.

A year ago I had a beer with a mate and told him about an idea, complete with the clichéd napkin sketch. Like most ideas, it was fleeting and I didn't do anything with it. Until recently, when we had another beer and he walked me through how it had stuck with him, and showed me some renders of his interpretation. Suddenly we have a viable product and are hoping to bring it to market soon.

When you share ideas with people, they get better. Maybe they'll help you bring it to life. Or introduce you to someone who can. Or make a suggestion. Or just give you some good old-fashioned feedback.

Hackerspaces are built on this premise. They're not just garages with tools, but communities of knowledge that thrive through sharing.

More than anything though, sharing creates commitment. Having an idea is the easy part; the tough bit is making it tangible. While technology has made making ideas happen easier than ever, each time you share something you expose yourself just a little. And in doing so you give yourself another reason to bring it to life.

Don't be selfish with your ideas. They'll be better because of it, and you'll probably make more of them happen. I'll put my money where my mouth is too - if you're interested in the side project I'm working on above, get in touch and I'll share....

I've always been critical of digital gimmicks. Ideas like pizza boxes that turn into projectors or Coke packages that fold into VR headsets. They are concepts that get a lot of views on YouTube but that normal people don't actually use. They also make really good scam entries into awards.

Maybe I've been drinking too much Kool-Aid, but despite my opening paragraph, my skepticism is softening. Because there is value in the gimmick. Being first to do something builds a perception of innovation and agility. It doesn't matter if the end user doesn't use it, because the value of such concepts is the PRability of the video. We see this in a trend toward "consumer-facing case studies".

Not only do ideas like these build a perception of innovation, they can fuel a culture of innovation as well. Fresh gimmicks breed thinking that can be truly effective. No one really wants to order a pizza with an emoji. But Domino's use and balance these concepts with proper innovation like GPS driver tracking and profit-sharing crowd-sourcing (stuff real people use that drives sales and loyalty).

One approach to innovation, and a much cheaper one, is simply getting there first. It won't disrupt an industry, but it's not intended to - as long as your video doesn't get caught up in the circle jerk that is the marketing tech industry and actually reaches some of your target audience. Ideally in the right country too....

I spend more time discussing hashtags than I'd care to admit. This is mostly due to a general misunderstanding about what they're used for (clustering and participation) and a too-generous assumption that people give a shit about brands (they don't).

Hashtags allow the grouping of like-minded conversations. This can look like many things; from real-time commentary during Game of Thrones to streams of people tagging their holiday photos with the destination. At their most powerful, hashtags are anchors that allow movements to build. With rare exceptions, brands do not play in this space. At least, not as the creator of such hashtags.

If you're still not convinced:

- Real people do not use branded hashtags
- Putting a hashtag on something doesn't make it more shareable
- Measurement should not be the only reason something exists (and is not user-first)
- Turning your brand line into a hashtag for the sake of it is lame
- Turning your campaign line into a hashtag is even lamer

As always, there are exceptions. But when someone asks if you need a hashtag, the answer is probably not....

From a young advertising age you're taught about the Unique Selling Proposition. It's the most important line on a brief - the single thing to communicate to the consumer. By definition, the first word suggests the proposition must be something no other brand or product could own or use. But we're seeing this notion increasingly become redundant. Rightly or wrongly.

I attended a session at SxSW called How Norton Hacked Hollywood, a case study about the antivirus software brand releasing a 20-minute documentary on cyber crime. The discussion included the client, creative agency, director and film distributor. I rate it, although I haven't watched it (which says a lot, actually). It's definitely more interesting than what their competitors are doing, and I totally dig that they brought on a film distributor instead of getting the intern to 'seed' it. With increasing investment into content strategy, we'll see many more brands explore this approach.

I wonder though, could McAfee (Norton's competitor) have done this? And does that even matter? If a beer brand creates a poker app, does it matter that any number of competitor brands could have done the same - beer, alcohol or other? Does your answer change if the target audience regularly uses it?

We see this lack of uniqueness in traditional channels too. TV commercials often do a category job, and occasionally not even that. The strategy behind most Superbowl ads is to make the audience laugh and slap a logo on the end. Byron Sharp says distinctiveness is critical in making brands identifiable, which may be a better interpretation of "unique".

To answer my own question, perhaps the proposition doesn't have to be something only your brand could own, but rather something your brand is first to own. One of my favourite content pieces ever created is Gatorade Replay. Does it matter that Powerade could have made it?
Stealing from a meme, the above image (which replaces the word art with advertising) suggests an appropriate response could be "Yeah but Powerade didn't." Being first might just be more important than being unique. Both in owning your proposition and what you create. Especially in categories where there is little difference in the product benefit. Or, is that the difference between good advertising and great?...

A depressing glimpse into the future. Virtual Reality is so hot right now. Facebook, Samsung and Sony have all announced consumer headsets to drop in the next few months, and it was all anyone could talk about at SxSW this year. Presenters spoke to its potential, exhibitors showcased new hardware and software, and every brand used it as part of their activation. (You could ride a virtual roller coaster with Samsung, race a virtual bike with IBM or paint virtual 3D art with McDonald's.)

But here are a few potential problems I've noticed:

- The screen resolution still sucks
- It's nauseating (called simulator sickness)
- Users, blind to the real world, were running into walls or getting tangled in cables
- The headsets are awkward and heavy (well, at least on my big head)
- It's hard to take a selfie when you're wearing a headset (an actual problem for social norming)
- People have greasy foreheads and noses (hurts shareability)

No doubt, VR will have many uses. It's fairly disruptive (for want of a better word) when it comes to activations. Education feels like a rich territory. And of course, as with most tech, the best uses are often not discovered until they land in the hands of smart consumers.

But VR will only take off if there's a headset in every home. And the big players are banking on this (given their billion-dollar investments). They think the answer is gaming, and have built their penetration strategies accordingly: Facebook's Oculus Rift in partnership with Xbox, Sony's VR through PlayStation and Samsung's Gear through the Galaxy mobile.

Here's the thing: the bullet points listed above can all likely be addressed in future generations. The fundamental problem is not with the tech, but the experience. VR gaming is a novelty, and will quickly grow old. Like the short-lived bell curve of Nintendo Wii's sales, VR is a fad. Gamers do not need or want a more immersive experience. I suspect it's the same for VR porn (although I've not yet tried it).
Of course that doesn't mean you shouldn't explore this space. When Coke changes its packaging to create cardboard VR headsets, they don't need anyone to actually use it. The campaign video is more important than the product, and by the time it hits market, PR has already done what it was meant to. Gimmicks like this don't necessarily need adoption to be successful (especially not for award case studies).

Long-term, I'm pessimistic when it comes to consumer adoption. It's rather easy for someone to call you out when you blog a prediction, but if I had to put money on it I'd say that after a burst of sales from early adopters, numbers will crash and burn. And in 18 months there's going to be a rather large supply of second-hand VR headsets collecting dust.

While I'm being skeptical, does anyone else hate 360-degree video?...

Comment threads are very unpopular right now. Their usefulness doesn't scale and they're abused by trolls. It's become easier for publishers to disable them than to deal with the headache. But comments aren't the problem - people are.

My Grandma Nola has written letters to Editors since long before I was born. She has three scrapbooks full of newspaper clippings - some so old they've almost completely faded. Of them all, the following letter, published in the Herald Sun in June 1991, is her favourite. As you read through, you may recognise some familiar behaviour. (If you're not from Melbourne, Brighton is a very affluent suburb. Preston is not, although it's well on its way to being gentrified these days.)

Nola's letter is published, kick-starting the furore. (I'm not sure what a "three-numbered address" is meant to imply.) On the same day, not knowing what was coming, the cartoonist includes this piece next to it.

It doesn't take long for someone to bite back (well, 14 days seems fast for 1991). Alongside P. Piper's reply, the cartoonist jumps in again. Decent satire, albeit not entirely relevant.

Nola defends herself. But in the comment game, logic doesn't get you anywhere. Disguised as the moral high ground, a troll attempts to wind both parties up. Well off topic, but commenter Alan can't pass up a chance to make fun of the wealthy.

The comments start coming in from both sides. Interestingly, it's later revealed that Wayne is in fact my father, Nola's son, dubiously using the option of anonymity. On the other side of the fence, Rat Packer makes it personal and challenges the OP's (Original Poster's) integrity.

Gang mentality kicks in. Among the banter, a "taxpayer" looks down on Alan's comment. Of course, completely off topic by this point. Again under the guise of the moral high ground, this time Diana struggles to hide her snobbery and xenophobia.

By this point in the thread there's always one reader who clearly hasn't followed everything. K.S. reposts someone else's point. Henry whinges about whinging. The cartoonist returns. Posted too late and buried right at the bottom, the most rational comment of all. And finally, reminding everyone there's more to life, the Editor closes the comments. Months later, the now-infamous Dead Rat of Brighton has one last moment in the sun.

Other than just being amusing, it's an interesting observation of human behaviour. We see from 25 years ago the same problems we see today, long before comment threads or even the internet. The only difference is that now it just happens faster. And perhaps with a few more references to Nazis....

I'm surprised there's not more debate about content ownership. Not the usual "you wouldn't steal a car" torrenting conversation, but one about brand pirates who steal content.

If you're not sure what I'm talking about, you need only browse your Facebook feed to see global corporations, publishers, small businesses, non-profits and your grandma all publishing content they don't own.

Stealing content has become standard practice. But you can't really blame the users - rights ownership and brands 'sourcing' content for social has always been a gray area. Is a press image public domain? Who owns a meme? Can we publish someone's user-generated content if they've used our hashtag? If you credit someone, you can use their content, right?

Channels, for the most part, address this reasonably well. Google rewards original content. YouTube automatically detects stolen content and removes it (or, more cleverly, redirects ad revenue to the appropriate owner). Twitter even deletes tweets with stolen jokes if they aren't attributed to the original author.

The problem is Facebook, who do almost nothing to protect content ownership, and in doing so only encourage a behaviour of stealing. So much so that whole businesses are built around it (how much content do you think the Lad Bible actually creates?).

To be fair, Facebook has kept up an appearance of giving a shit. Their Terms of Service explicitly forbid it, and they somewhat famously deleted the original Cool Hunter page, with 780,000 fans, for ongoing breaches. They even suggest that images, videos and links the algorithm has never seen before will be viewed more favourably and therefore given more organic reach.

But Facebook's not really trying to curb brand piracy. Users, brands and publishers are all still allowed to publish stolen content, and are often rewarded for it. Even in their transition to becoming a video channel (which they now claim serves more videos daily than YouTube), most of this content is stolen. (And the original owners of said content don't see any revenue either.)

This, of course, makes total sense. More content, original or stolen, means more impressions, which means more revenue. It's in Facebook's best interest to allow this culture of thieving to continue, because stolen content is easier to produce at volume. Creating original content requires resources. And why would you invest if there's little reward in doing so?

I wonder out loud (like much of what I publish here) if we're not going to see some debate about our collective attitude to brand piracy on Facebook. Because the brands that create original content should be rewarded, not punished....

My last post about my dislike of display advertising caused a bit of controversy. Perhaps I was too heavy-handed in my language, although sometimes you need to be in order to cut through. In any case, I thought I'd balance it out with something a little more rational this time:

On average, more than 1 in 5 banner clicks do not reach the site.

This is the result of analysing more than 110,000 clicks across four industries, fifteen campaigns and a variety of adservers, publishers, targeting, devices and formats. Click data (as reported by the adserver) was compared directly to site visit data (as reported by Google Analytics), resulting in an average click-to-visit ratio.

I'll be the first to suggest this is by no means a robust piece of research. It's a small sample of campaigns, and the spread of results ranged from as low as 37% in one case to more than 100% in another (meaning there were more visits than clicks). Without a doubt, it needs further investigation. But even still, I wonder if it's not worth having a conversation about some data that suggests, on average, only 78% of banner clicks arrive at their intended destination. Especially if you're paying on a cost-per-click model.

Perhaps as an industry we need to move away from clicks and click-through rates to something like visits and visit-through rate. Although we'd need to add another decimal place in our reports....
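To make the click-to-visit arithmetic concrete, here's a minimal Python sketch of the comparison described above. The campaign figures are made up for illustration (they are not the actual dataset), and the helper name `visit_through_rate` is my own, not from any analytics tool:

```python
# Sketch: comparing adserver-reported clicks against analytics-reported visits.
# All campaign numbers below are illustrative, not the post's real data.

campaigns = {
    "campaign_a": {"clicks": 12000, "visits": 4440},   # a low outlier, like the 37% case
    "campaign_b": {"clicks": 8000,  "visits": 8400},   # more visits than clicks (>100%)
    "campaign_c": {"clicks": 30000, "visits": 24000},
}

def visit_through_rate(clicks, visits):
    """Fraction of reported clicks that actually arrived as site visits."""
    return visits / clicks

# Per-campaign ratios
for name, data in campaigns.items():
    ratio = visit_through_rate(data["clicks"], data["visits"])
    print(f"{name}: {ratio:.1%} of clicks became visits")

# Pooled average across all campaigns
total_clicks = sum(c["clicks"] for c in campaigns.values())
total_visits = sum(c["visits"] for c in campaigns.values())
print(f"overall: {total_visits / total_clicks:.1%}")
```

In practice the two data sources rarely line up perfectly - adservers and Google Analytics count at different points in the redirect chain, which is one reason a single campaign's ratio can exceed 100%.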

I've been interviewing lately, and one of my favourite questions to ask is "What is your biggest fuck up?" One candidate asked me the same question, and this is what I told them.

I used to write a monthly newsletter for my clients, updating them on digital trends and changes in the industry. Each section had a paragraph on the news and a button to find out more. It was always proofread multiple times (nothing worse than sending a typo to a few dozen clients), but one time I forgot to include a link on a button before it was sent. As a placeholder to update later, I'd put "xxx" rather than a URL in the hyperlink. But most browsers now have a feature where, if a URL isn't properly formatted, they turn it into a search instead. Every client who clicked that button ended up Googling "xxx".

So if you're proofreading something, check the links as well. And if you're going to use placeholder text, try "asdf" instead....
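A simple guard against exactly this mistake is to audit every link before the HTML goes out. Here's a minimal sketch using only the Python standard library; the `LinkAudit` class and the sample newsletter snippet are hypothetical, not the actual tooling from the story:

```python
# Sketch: flag hrefs in an HTML email that aren't well-formed absolute
# http(s) URLs - placeholders like "xxx" included.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAudit(HTMLParser):
    """Collects hrefs that would not resolve as proper web links."""

    def __init__(self):
        super().__init__()
        self.bad_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        parsed = urlparse(href)
        # A usable newsletter link needs a scheme and a host.
        if parsed.scheme not in ("http", "https") or not parsed.netloc:
            self.bad_links.append(href)

# Hypothetical newsletter fragment with one good link and one placeholder
newsletter = (
    '<a href="https://example.com/news">Read more</a> '
    '<a href="xxx">Find out more</a>'
)
audit = LinkAudit()
audit.feed(newsletter)
print(audit.bad_links)  # → ['xxx']
```

Run against the final HTML before sending, the "xxx" placeholder (or an empty href) gets caught before any client can Google it.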