Rich Julius: Blogging Writes

On the Digital Future of Content


Search Engine Marketing (SEM) Tips: Analytics and Performance

Posted on Feb 21, 2013 in Digital Marketing, More Visitors: SEO & SEM, Technology & Analytics | 0 comments

John Wanamaker (1838-1922), considered by some as the father of modern advertising, is often quoted in marketing circles: “Half the money I spend on advertising is wasted; the trouble is I don’t know which half.” A century later, we no longer have that problem. In the age of digital marketing, with exceptional tracking systems (that stretch the very definition of privacy), it’s no longer a matter of not knowing, but of investing the effort to find out. Today, savvy online marketers are saying “no campaigns without metrics.” SEM is a perfect example of this principle in action, and the same principle applies to email and social marketing.

When you drive traffic to a web or mobile site using SEM, the basics of managing pay-per-click include analyzing the cost of each keyword, deciding what to bid on, and then monitoring click-through reports to check performance and to see whether you have been out-bid on your keywords (in which case you may need to raise your bid or lose that keyword). But the basic reports in Google and Bing only tell you how much traffic you received for what you spent. They don’t tell you how individual keywords are performing against your business and organizational goals. What you don’t want to do is simply point to high SEM traffic and declare success. What if your top-performing keywords are draining your budget and delivering no value? You need to monitor what your visitors do, based on the keyword they came in on. A good analytics program will tell you not just which keywords get the most clicks (and thus cost you the most), but also whether visitors arrive and then immediately leave (SEM cost with no value), which pages they visit and in what order, and whether they return to your site later (SEM cost with high value).
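As a rough sketch of that value check (the keywords, costs, and conversion numbers below are hypothetical, not from any real AdWords report), you might combine click-report spend with on-site outcomes per keyword:

```python
def keyword_value(clicks, cost, bounces, conversions):
    """Return (bounce_rate, cost_per_conversion) for one keyword."""
    bounce_rate = bounces / clicks
    cost_per_conv = cost / conversions if conversions else float("inf")
    return bounce_rate, cost_per_conv

# Hypothetical data: keyword -> (clicks, total cost, bounces, conversions)
keywords = {
    "digital newsroom": (420, 310.00, 80, 21),
    "cms software":     (980, 760.00, 900, 2),   # lots of traffic, little value
    "news analytics":   (150, 95.00, 30, 12),
}

for kw, (clicks, cost, bounces, conversions) in keywords.items():
    rate, cpc = keyword_value(clicks, cost, bounces, conversions)
    print(f"{kw}: bounce rate {rate:.0%}, cost per conversion ${cpc:.2f}")
```

A keyword like the hypothetical “cms software” above looks like a winner in a traffic-only report, but the cost-per-conversion figure exposes it as a budget drain.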
About 10 years ago I developed a system for tracking campaign and keyword performance from Google AdWords and Overture (Overture was bought in 2003 by Yahoo, and Yahoo SEM is now merged with Bing). The idea was simple: if...

Read More

Newsroom Software: WordPress and Other Open Source Options

Posted on Apr 23, 2012 in Technology & Analytics | 0 comments

Some publishers, including thought leaders like John Paton, have cited the potential of using Open Source software to develop low-cost digital newsrooms. In the blog of the Journal Register’s Ben Franklin Project, they write: “we will be using only free web-based tools,” and they then deliver a catalogue of such tools. T. S. Eliot, speaking of the “Free Verse” movement, wrote “No verse is free for the man who wants to do a good job.” In Silicon Valley there has long been a similar saying that “Free Software is never really free.” The issue at hand is what business and finance folks call TCO: Total Cost of Ownership. TCO is the measure of the true cost of a system, including acquisition, customization, support, maintenance, training, and several other cost factors. Open Source software is also known as Free Software, as in the Free Software Foundation, the non-profit body that supports the Open Source movement and which wrote the most common Open Source license, the General Public License, or GPL. But “free” refers to the freedom to use and distribute the software; it does not actually refer to price. “The word free in the term free software refers to freedom (liberty) and is not at all related to monetary cost.” For newsrooms, “free” Open Source software may well be the most expensive option available.

Now, before I get too deep into the issues with Open Source newsrooms, I want to go on record as saying that I am a proponent of Open Source software myself. I love WordPress, and all its fun plug-ins (look, I am blogging in it right now). My company’s web site runs on WordPress and our cloud-based newsroom software runs on the free Open Source Linux/MySQL platform. I am, among other things, a tech geek with a team of open source developers. I even write code myself, when the developers aren’t looking. I can afford to use Open Source.
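The TCO arithmetic is simple enough to sketch. The cost categories and every figure below are hypothetical, purely to illustrate how a zero license fee can still lose to a paid product once customization, support, and training are counted over a few years:

```python
def total_cost_of_ownership(costs, years):
    """Sum one-time costs plus recurring annual costs over a period.
    `costs` maps a category name to (one_time, annual) dollar amounts."""
    return sum(one_time + annual * years for one_time, annual in costs.values())

# Hypothetical 3-year comparison: "free" Open Source vs. a licensed system.
open_source = {
    "license":       (0,     0),
    "customization": (40000, 5000),   # in-house development effort
    "support":       (0,     15000),  # staff time and consultants
    "training":      (8000,  2000),
}
licensed = {
    "license":       (20000, 10000),
    "customization": (5000,  1000),
    "support":       (0,     0),      # bundled with the license fee
    "training":      (2000,  500),
}

print(total_cost_of_ownership(open_source, 3))
print(total_cost_of_ownership(licensed, 3))
```

With these made-up numbers, the “free” option costs nearly twice as much over three years; your own numbers will differ, but the categories are the ones any honest TCO comparison has to include.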
But most small newsrooms (and a lot of larger ones) do not have the technical depth to use Open Source software, and if they do, that technical depth is part of the cost of an...

Read More

Don’t Build Your Own Newsroom

Posted on Mar 22, 2012 in Technology & Analytics | 0 comments

I recently spoke to several publishers at Publishing Expo in London about their content management systems, and a few gave me the usual sad story about trying to build a newsroom in-house. This is more than a “build vs. buy” issue, especially for newsrooms, where the CMS is critical to the entire business operation. Failure in the newsroom is not only expensive; it can spell disaster for the company as a whole.

IT project failure rates are the stuff of legend. The Standish Group, a Boston-based IT consulting firm, is famous for its annual CHAOS Report, which documents IT project failure rates: success rates hover around a shocking 30%, with over two-thirds of projects rated “Failed” or “Challenged” under the report’s criteria. Statistics like these are well known to the IT and bespoke development community in the corporate IT world. But lessons from corporate IT don’t seem to have trickled into the publishing world, so let me offer a few insights into project failure and why you don’t want to build your own newsroom.

First, newsroom systems are complex pieces of software; they are not the simple document management systems a clever IT organization could reasonably be expected to design and build. Today’s newsroom systems manage workflow, multimedia, complex information architectures, analytics, multi-platform delivery, social media integration, and a host of other features that make them pretty complicated machines. But instead of going into the complexities of these systems, the more important insight is in understanding the difference between software developers and IT staff. A software developer is an engineer, trained to design and build software. IT professionals are trained to install, manage, customize, and integrate software, but they are not trained to conceive, design, and build it.
The best analogy is from the automotive industry: software engineers are like automotive engineers, the folks who design cars and motorcycles for manufacturers. IT pros are mechanics, the folks who repair, maintain, and sometimes modify the cars and motorcycles designed by automotive engineers. Software engineers generally go to college to learn engineering or computer science. IT staff may well have engineering degrees, but many are vocationally...

Read More

Site Analytics: Intelligence Gathering for News Sites

Posted on Feb 21, 2012 in Technology & Analytics | 0 comments

Analytics Overview

Late last summer the good folks at MSN invited me to give a talk on “Conducting Effective Market Landscape Assessments and Intelligence Gathering,” where I discussed techniques for gathering competitive intelligence. One of the important topics I covered was how analytics (research based on observational data gathering) is critical to understanding visitor and market behavior. There are actually a few types of analytics.

Web Server Log Reports (site statistics) are the oldest form of analytics, typified by products like WebTrends and a host of Open Source products like AWStats. These systems report on the data collected in log files maintained by your webserver, logs that record a time and date stamp for every web page and every image the webserver serves up. Site stats deliver reports such as the most popular pages on your site, top entry pages, top exit pages, overall pages served, and the overall number of “hits” (the number of resources served by the webserver, now considered an almost meaningless metric, since these days a single web page with 4 JavaScript calls and 12 images represents 17 hits).

Path Analysis is used to track every individual who comes to a site, and every page they visit. This type of analysis, performed by products like Adobe/Omniture SiteCatalyst and iMedia Analytics, collects a lot of data and delivers a lot of intelligence, including: heat maps, reports that tell you which links on any given page are getting the most clicks; page-dotting, the tracking of every variable in a visitor’s web site session, which can tell you things like which items they abandoned in their shopping cart; click-path analysis, the most common paths that users take through your site, providing insight into things like user-interface strengths and deficiencies; and real-time story trending, the ability to see almost instantly which of your stories are “going viral” and which are languishing.
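The “hits versus page views” distinction is easy to see in miniature. Here is a sketch of what a log-based stats tool does, run over four hypothetical log lines in Apache Common Log Format: every served resource counts as a hit, but only the HTML request is a real page view:

```python
import re

# Hypothetical webserver log: one page plus its supporting resources.
LOG = """\
1.2.3.4 - - [21/Feb/2012:10:00:01 +0000] "GET /story.html HTTP/1.1" 200 5120
1.2.3.4 - - [21/Feb/2012:10:00:01 +0000] "GET /js/app.js HTTP/1.1" 200 900
1.2.3.4 - - [21/Feb/2012:10:00:02 +0000] "GET /img/photo1.jpg HTTP/1.1" 200 40210
1.2.3.4 - - [21/Feb/2012:10:00:02 +0000] "GET /img/photo2.jpg HTTP/1.1" 200 38777
"""

PAGE = re.compile(r'"GET (\S+\.html) ')  # crude "is this a page?" test

hits = 0
pageviews = 0
for line in LOG.splitlines():
    hits += 1                 # every resource served is a "hit"
    if PAGE.search(line):
        pageviews += 1        # only actual pages count as views

print(hits, pageviews)        # 4 hits, but only 1 real page view
```

Scale that up to a page with 4 JavaScript calls and 12 images and you get the 17-hits-per-page inflation described above, which is why nobody serious quotes hits anymore.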
Another category is what I call Broad-spectrum External Analytics, which collects a certain amount of page data (generally less than the other methods, but still enough to deliver powerful reports) by adding a bit of code to your web page that sends data to an external, third...

Read More

Copyright © 2011-2015 Rich Julius · All Rights Reserved · Blogging Writes · Google+