Archive for the ‘News’ Category

Salesforce adds social marketing power with $2.5 billion ExactTarget deal

July 11, 2013 By: admin Category: News

By Sayantani Ghosh

(Reuters) – Salesforce.com Inc, the biggest maker of online sales management tools, said it would pay $2.5 billion for marketing software maker ExactTarget, which helps companies reach customers on social networks and through mobile devices.

The acquisition — the biggest ever for Salesforce — is its eighth in the past year and its second big purchase focused on social media. It acquired Buddy Media, which helps big brands manage Facebook and Twitter pages, in August.

ExactTarget provides internet-based marketing software used by businesses to personalize e-mail and text messages and to run social media ad campaigns. It has 6,000 customers, including Coca-Cola Co, Gap Inc and Nike Inc.

But investors took a dim view of Salesforce’s latest purchase, pushing down its shares as much as 5 percent in early trading.

Big technology companies such as Salesforce and Oracle Corp are seeking growth through acquisitions as their traditional businesses slow and smaller rivals take advantage of social media.

Considered the leader in cloud computing, Salesforce posted lackluster quarterly earnings last month as costs rose following its acquisition spree.

Salesforce Chief Executive Marc Benioff, who called the ExactTarget deal the company’s most important ever, said Salesforce would now dramatically slow the pace of deal making.

“I really think what you are going to see is us taking a vacation from M&A for anywhere between probably 12 and 18 months,” he said on a conference call with analysts on Tuesday.

The acquisition will reduce Salesforce’s operating cash flow by $75 million to $80 million in fiscal 2014. The company said it now expected fiscal 2014 cash flow growth in the low teens, compared with its prior estimate of a low-20-percent range.


The offer price of $33.75 per share for ExactTarget represents a 53 percent premium to the stock’s closing price on Monday on the New York Stock Exchange.

The offer is 6.6 times ExactTarget’s expected revenue for 2013, according to Thomson Reuters I/B/E/S.

ExactTarget shares, which prior to Tuesday’s offer had risen 16 percent since they listed on the New York Stock Exchange in March 2012, rose 53 percent in early trading to $33.73 on Tuesday.

Salesforce said its revenue for the fiscal year ending January 2014 would increase by $120 million to $125 million as a result of the ExactTarget acquisition, which is expected to close in the quarter ending July 31.

The acquisition is expected to reduce fiscal 2014 earnings by about 16 cents per share, Salesforce said. The company, which had about $3.1 billion in cash and marketable securities at the end of the first quarter, said it would finance the transaction through cash on hand and a term loan. Salesforce shares were down 3 percent at $39.77 just after the opening.

BofA Merrill Lynch was the financial adviser to Salesforce, and JP Morgan advised ExactTarget.

(Additional reporting by Chandni Doulatramani in Bangalore; Editing by Saumyadeb Chakrabarty)

Google Penguin, the Second (Major) Coming: How to Prepare

April 10, 2013 By: admin Category: News

By: Simon Penson

Unless you’ve had your head under a rock, you’ve undoubtedly heard the rumblings of a coming Google Penguin update of significant proportions.

To paraphrase Google’s web spam lead Matt Cutts, the algorithm filter has “iterated” to date, but a “next generation” is coming that will have a major impact on SERPs.

Having watched the initial rollout take many by surprise, it makes sense this time to at least attempt to prepare for what may be lurking around the corner.

Google Penguin: What We Know So Far

We know that Penguin is purely a link quality filter that sits on top of the core algorithm, runs sporadically (the last official update was in October 2012), and is designed to take out sites that use manipulative techniques to improve search visibility.

And while there have been many examples of this being badly executed, with lots of site owners and SEO professionals complaining of injustice, it is clear that web spam engineers have collected a lot of information over recent months and have improved results in many verticals.

That means Google’s team is now on top of the existing data pile and its testing output, and as a result is hungry for another major structural change to the way the filter works.

We also know that months of manual resubmissions and disavows have helped the Silicon Valley giant collect an unprecedented amount of data about the “bad neighborhoods” of links that had powered rankings until very recently, for thousands of high profile sites.

They have even been involved in specific and high profile web spam actions against sites like Interflora, working closely with internal teams to understand where links came from and watch closely as they were removed.

In short, Google’s new data pot makes most big data projects look like a school register! All the signs therefore point towards something much more intelligent and all encompassing.

The question is: how can you profile your links and understand the probability of being impacted when Penguin hits within the next few weeks or months?

Let’s look at several evidence-based theories.

The Link Graph – Bad Neighborhoods

Google knows a lot about what bad links look like now. They know where a lot of them live and they also understand their DNA.

And once they start looking it becomes pretty easy to spot the links muddying the waters.

The link graph is a kind of network graph and is made up of a series of “nodes” or clusters. Clusters form around IPs and as a result it becomes relatively easy to start to build a picture of ownership, or association.

Google assigns weight or authority to links using its own PageRank currency, but like any currency it is limited and that means that we all have to work hard to earn it from sites that have, over time, built up enough to go around.

This means that almost all sites that use “manipulative” authority to rank higher will be getting it from an area or areas of the link graph associated with other sites doing the same. PageRank isn’t limitless.

These “bad neighborhoods” can be “extracted” by Google, analyzed, and dumped relatively easily.

They won’t disappear, but Google will devalue them and remove them from the PageRank picture, rendering them useless.
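To make the currency analogy concrete, here is a toy PageRank calculation over a made-up six-page link graph. The page names, damping factor, and devaluation step are illustrative assumptions for this sketch, not Google's actual system:

```python
# Toy illustration: a "bad neighborhood" of spam pages funnels PageRank
# to a target site. Devaluing those links leaves the pages in place but
# stops them passing authority, collapsing the target's score.

def pagerank(graph, damping=0.85, iters=50):
    """Simple power-iteration PageRank over {page: [outbound links]}."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in graph.items():
            if not outs:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Three spam pages all link to "target"; one trusted site links editorially.
graph = {
    "spam1": ["target"], "spam2": ["target"], "spam3": ["target"],
    "trusted": ["editorial"], "editorial": [], "target": [],
}
before = pagerank(graph)

# Devaluation: the spam pages still exist, but their outbound links
# no longer pass any PageRank.
devalued = dict(graph, spam1=[], spam2=[], spam3=[])
after = pagerank(devalued)

print(round(before["target"], 3), round(after["target"], 3))
```

In the before/after scores, the target page loses most of its rank once the spam cluster stops passing authority, while the editorially linked page is unaffected.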

Expect this process to accelerate now that the search giant has so much data on “spammy links,” with swathes of link profiles getting knocked out overnight.

The concern, of course, is that there will be collateral damage, but with any currency rebalancing, which is really what this process is, there will be winners and losers.

Link Velocity

Another area of interest at present is the rate at which sites acquire links. In recent months there has been a noticeable change in how new links are being treated. While this is very much theory, my view is that Google has become very good at spotting link velocity “spikes,” and anything out of the ordinary is immediately devalued.

Whether that devaluation is indefinite or time-limited (in the same way the “sandbox” works) I am not sure, but there are definite correlations between sites that earn links consistently and good ranking increases. Those that earn lots of links quickly do not get the same relative effect.

And such a check would be relatively straightforward to fold into the Penguin model, if it isn’t there already. Picture a “bumpy” link acquisition profile: anything above the “normalized” baseline could be devalued.
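As a rough illustration of how such a check might work, a spike detector only needs a trailing baseline to compare each period against. The window size and threshold here are invented for the example, not known Google values:

```python
# Hypothetical sketch of velocity-spike detection: flag any week whose
# new-link count sits far above a trailing "normalized" baseline.

def flag_spikes(weekly_links, window=4, multiplier=3.0):
    """Return indices of weeks whose link count exceeds `multiplier` times
    the average of the preceding `window` weeks."""
    flagged = []
    for i in range(window, len(weekly_links)):
        baseline = sum(weekly_links[i - window:i]) / window
        if baseline > 0 and weekly_links[i] > multiplier * baseline:
            flagged.append(i)
    return flagged

# A steady acquisition profile with one sudden burst at week 6.
weekly = [10, 12, 11, 9, 10, 11, 140, 12, 10]
print(flag_spikes(weekly))  # the burst week is flagged
```

A real system would also have to decide, as discussed above, whether flagged links are devalued permanently or only for a sandbox-style period.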

Link Trust

The “trust” of a link is also of interest to Google. Quality (how much juice the link carries) is one thing, but trust is another thing entirely.

Majestic SEO has captured this reality best with the launch of its Citation Flow and Trust Flow metrics, which help identify untrusted links.

How is trust measured? In simple terms it is about good and bad neighborhoods again.

In my view Google uses its Hilltop algorithm, which identifies so-called “expert documents” across the web: sites seen as shining beacons of trust. The closer your site is to those documents, the better the neighborhood. It’s a little like living on the “right” road.

If your link profile contains a good proportion of links from trusted sites then that will act as a “shield” from future updates and allow some slack for other links that are less trustworthy.

Social Signals

Many SEO pros believe that social signals will play a more significant role in the next iteration of Penguin.

While social authority, as it is becoming known, makes a lot of sense in some markets, it also has limitations. Many verticals see little to no social interaction, and without big pots of social data a system that qualifies link quality by the number of social shares across a site or piece of content can’t work effectively.

In the digital marketing industry it would work like a dream, but for others it is a non-starter, for now. Google+ is Google’s attempt to fill that void: by nudging as many people as possible to stay logged in, Google moves everyone closer to Plus and to handing over that missing data.

In principle, though, it is possible that social sharing and other signals may be used in a small way to qualify link quality.

Anchor Text

Most SEO professionals will point to anchor text as the key telltale metric when it comes to identifying spammy link profiles. The first Penguin rollout would undoubtedly have used this data to begin drilling down into link quality.

In researching this post, I asked a few prominent SEO professionals what they consider the key indicator of spam, and almost all pointed to anchor text.

“When I look for spam the first place I look is around exact match anchor text from websites with a DA (domain authority) of 30 or less,” said Distilled’s John Doherty. “That’s where most of it is hiding.”

His thoughts were backed up by Zazzle’s own head of search Adam Mason.

“Undoubtedly low value websites linking back with commercial anchors will be under scrutiny and I also always look closely at link trust,” Mason said.

The key is the relationship between branded and non-branded anchor text. Any natural profile would be heavily led by branded anchors (e.g., the company or site name) and “white noise” anchors (e.g., “click here”, “website”, etc).

The allowable percentage is tightening. A recent study by Portent found that the percentage of “allowable” spammy links has been reducing for months now, standing at around 80 percent pre-Penguin and 50 percent by the end of last year. The same is true of exact match anchor text ratios.

Expect this to tighten even more as Google’s understanding of what natural “looks like” improves.
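Profiling your own anchor text along these lines is straightforward. The sketch below buckets anchors into branded, “white noise,” and everything else; the brand terms and stop phrases are made-up examples, and a real profile would need fuzzier matching:

```python
# Rough anchor-text profiling: bucket backlink anchors into branded,
# "white noise", and commercial (potentially risky) groups, then report
# each bucket's share of the total profile.

BRAND_TERMS = {"acme", "acme.com"}  # hypothetical brand
WHITE_NOISE = {"click here", "website", "here", "this site", "read more"}

def profile_anchors(anchors):
    """Return each bucket's share as a fraction of all anchors."""
    counts = {"branded": 0, "white_noise": 0, "commercial": 0}
    for anchor in anchors:
        a = anchor.lower().strip()
        if any(term in a for term in BRAND_TERMS):
            counts["branded"] += 1
        elif a in WHITE_NOISE:
            counts["white_noise"] += 1
        else:
            counts["commercial"] += 1  # exact/phrase-match money terms
    total = len(anchors)
    return {k: v / total for k, v in counts.items()}

anchors = ["Acme", "click here", "buy cheap widgets", "acme.com",
           "website", "best widget deals"]
print(profile_anchors(anchors))
```

If the “commercial” share dominates, that is exactly the exact-match problem Doherty and Mason describe above.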


Relevancy

One area that will certainly be under the microscope as Google looks to improve its semantic understanding is relevancy. As it builds up a picture of relevant associations, that data can be used to assign more weight to relevant links. Penguin will certainly be targeting links with no relevance in future.

Traffic Metrics

While traffic metrics probably fall more under Panda than Penguin, the lines between the two are increasingly blurring to a point where the two will shortly become indistinguishable. Panda has already been subsumed into the core algorithm and Penguin will follow.

On that basis Google could well look at traffic metrics such as visits from links and the quality of those visits based on user data.


No one can accurately predict what the next coming will look like, but we can be certain that Google will turn the knife a little more, making link building in its former sense a riskier tactic than ever. As numerous posts have pointed out in recent months, it is now about earning links by contributing and adding value through content.

If asked where my money was, I would say we will see the allowable level of spam tightened still further, along with some attempt to measure link authority by the neighborhood it comes from and any associated social signals. The rate at which links are earned will also come under more scrutiny, and that means you should think about:

  • Understanding your link profile in much greater detail. Tools and data from companies such as Majestic, Ahrefs, CognitiveSEO, and others will become more necessary to mitigate risk.
  • Where your links come from, not just what level of apparent “quality” they carry. Link trust is now a key metric.
  • Increasing the use of brand and “white noise” anchor text to remove obvious exact- and phrase-match anchor text problems.
  • Looking for sites that receive a lot of social sharing relative to your niche and building those relationships.
  • Running backlink checks on the sites you get links from to ensure their equity isn’t coming from bad neighborhoods, as that could pass to you.

This post originally appeared on

Google Running Feedback Experiment Similar to Human Quality Rater Test

December 19, 2012 By: admin Category: News

By: Matt McGee, December 19, 2012


Google has long asked searchers to provide feedback on the quality of its search results, and often runs a number of tests aimed at encouraging such feedback. The latest such experiment, which seems to have been live for at least a month or so, is a bit different because it asks searchers for feedback only on certain results.

Eli Schwartz recently shared with us a screenshot of a search for [product synonym], in which Google uses the open space on the right of the results page to ask “Which result do you prefer?”


What’s most interesting is that Google isn’t asking for general feedback on the search results page as a whole; it’s specifically pulling out two of the results — and not the top two. In this case, Google is specifying the third and fourth links and asking the searcher to “visit both pages before choosing.”

That’s similar to the “side-by-side” tasks that Google’s army of human search quality raters often perform — a type of task that you can learn more about in these articles:

In that second article, you’ll find a screenshot of a “basic” side-by-side task that also asks the evaluator to view two specific pages and choose the better result.

The Street recently saw this same feedback form, and pointed out that the “Learn more” link at the bottom leads to a Google help page that says user feedback “will not directly influence the ranking of any single page,” and refers to how the data is used in conjunction with its “professional search evaluators.”

In a typical year, we experiment with tens of thousands of possible changes. These changes, whether minor or major, are tested extensively by professional search evaluators, who look at results and give us feedback, and “live traffic experiments” where we turn on a change for a portion of users. Testing helps us whittle down our list of changes to about 500 improvements per year.

Earlier this summer, Google was running a similar, but less specific, feedback test that asked searchers “How Satisfied Are You With These Results?” That survey asked for feedback on the full search results page.

Bing Adds People and Places to Snapshot

December 17, 2012 By: admin Category: News

By: Allison Howen, December 14, 2012

Bing’s Snapshot, which displays information such as maps, movie times and restaurant menus directly within search result pages, now includes two new categories – people and landmarks.

Snapshot is the middle column between Bing’s main search results and the social sidebar that resides on the right side of the results page. The feature, which was launched in June, displays relevant information related to search queries in order to make it easier for users to take actions, such as booking a hotel room, directly from the results page. According to Bing’s blog, the company ran thousands of experiments to determine which topics have been most frequently searched for since the launch of Bing’s three-column redesign, and the data led Bing to add people and places to Snapshot’s functionality.

Now upon a search query for a famous person, celebrity or place, Bing displays relevant facts about the topic to make it easier for users to find exactly what they are looking for before clicking through to another website. Furthermore, Snapshot includes content such as reviews, movie trailers and links that make it easy for users to discover information with a glance or take actions, such as listen to or purchase music, with just one click.

It is also important to note that this isn’t the last update for Snapshot, because according to Bing’s blog, the company will be adding more categories in the coming weeks.

This post originally appeared on

Happy Holidays From the Systemtek Team

December 17, 2012 By: admin Category: News

Dear Reader,

All of us at Systemtek Technologies would like to take a moment to express our gratitude to our loyal customers, friends, and family.

As the Holiday Season is upon us, we find ourselves reflecting on the success of the past year. We hope that next year will be successful for you as well.

We wish you a very happy Holiday Season and a New Year filled with peace and prosperity.


John Li

6 AdWords Enhancements Offer More Insights, Efficiency

December 04, 2012 By: admin Category: Marketing Tips, News

By: Jason Tabeling, December 4, 2012

In paid search marketing, the goal at all times is to drive better results. You spend your time poring over data looking for that one data point that will help you achieve those better results.

In order to focus your time on finding that data point you need two things:

  • Tools that drive efficiency so you can spend more time looking for that insight.
  • Access to incremental insights that provide you the ability to see new data points providing better opportunities to optimize.

Google has recently spent some time upgrading AdWords to provide tools that help in both the efficiency and insights categories. Here are a few that you might want to take advantage of.


  • Campaign Diagnostics: AdWords will now proactively alert you if you’re about to make a change to your account that could have a negative effect; for example, adding a negative keyword that would inadvertently block another keyword. This kind of catch flags problems that are far too easily missed when a well-intentioned change wouldn’t turn out the way you expect.
  • Impression Share for Search & Display: Have you been wondering what percentage of display inventory you’ve been buying? Now AdWords will break out performance individually across both search and display. This is very helpful for the display side; however, you still shouldn’t run display and search ads out of the same campaign.


  • Email-only alerts with automated rules: Not fully comfortable with the system making some changes without your final, final approval? This tool will provide you with proactive insights about actions that can be taken given the rules you provide.


  • Keyword columns in search terms report: For a while now, AdWords’ search query report has been providing information about which search query triggered your keyword ad. However, now you will be able to quickly add that keyword through the same screen.


  • Faster editing across your account: AdWords is bringing a feature that has always been available in the Editor to the web – the ability to make bulk changes quickly. The ability to raise, lower, or make bulk changes becomes a great way to save time. Just be sure you’re thinking about the implications of big generic bid changes.
  • AdWords scripts: The ability to write JavaScript code to automate your ads with info from a pricing database, update bids, and more. A helpful tool for creating efficiency and flexibility.

Overall, Google is doing a lot to help improve the system, and providing tools to help paid search marketers focus on driving improved results. Look for more to come based on feedback from the masses, and keep looking for those insights.

This post originally appeared on

Facebook Launches Conversion Measurement Tool

November 20, 2012 By: admin Category: Marketing Tips, News

By: Miranda Miller, November 20, 2012

Facebook began rolling out a conversion measurement tool on Friday to help marketers bridge the data gap between social ads and online sales.

David Baser, Facebook’s ads product manager, told Reuters the tool has been a highly requested feature for some time. “Measuring ad effectiveness and outcomes is absolutely crucial to all types of businesses and marketers,” he said.

“You would see the number of people who bought shoes,” he said, using the example of an online shoe retailer. However, marketers couldn’t get information that could identify the people, he added.

Third parties such as social shopping app maker Glimpse have been offering solutions to specific aspects of the social commerce “problem” for some time, particularly the disparate data sets available to online retailers.

“There are about 500 million products for sale and about 100 million (1 in 5) have a Like button next to them. Only about 3 million of those have ever had a Like,” Usher Liebermann of Glimpse told Search Engine Watch. “We see a lot of value in the data of knowing what someone has Liked. Once they give permission, we can see the products they Like, the stores and brands they Like, and the stores, brands and products their friends like.”

Glimpse uses their crawl data with Facebook’s Open Graph to gain more insight than is available from Facebook itself. “Say you like a shoe on Nordstrom’s site. Facebook knows you Like a product, but they don’t know it’s a shoe; only that it is a product and the page it came from,” Liebermann explained. “Nordstrom’s know something has been Liked. We have the contextual meta data around it… we actually know more about it than Facebook.” This allows Glimpse to analyze activity surrounding specific products and make social recommendations to users of the app.

Until now, however, retailers have struggled to directly attribute online purchases to Facebook ads activity. The new feature is as critical for Facebook as it is for retailers; they need to prove to advertisers that their ads are a worthy investment and to do that, measurable outcomes are key.

Still, advertisers saw improved ad performance in Q3 2012, with click-through rates more than making up for falling costs per click.

Social media revenue will reach $34 billion by 2016, according to Gartner analysts. Advertising and gaming account for approximately 90 percent of that revenue; in 2012, they accounted for $8.8 billion and $6.2 billion, respectively.

“New revenue opportunities will exist in social media, but no new services will be able to bring significant fresh revenue to social media by 2016,” said Neha Gupta, senior research analyst at Gartner. “The biggest impact of growth in social media is on the advertisers. In the short and medium terms, social media sites should deploy data analytic techniques that interrogate social networks to give marketers a more accurate picture of trends about consumers’ needs and preferences on a customized basis.”

The ability to attribute online sales to Facebook ads is certainly a big step in the right direction for advertisers. Facebook has not yet announced when the tool will be available to all marketers.

This post originally appeared on

Happy Thanksgiving!

November 19, 2012 By: admin Category: News

Thanksgiving is a family holiday full of love, sharing and giving thanks. It’s one day that you can forget about your job and responsibilities. The Systemtek team has a lot to be thankful for as well. We’re especially thankful for you, our customers. We hope that you can spend this special day with your family and friends in an atmosphere full of love and thanks.

The Systemtek team hopes you have a great Thanksgiving and may your business be full of customers.

Happy Thanksgiving!


The Systemtek Team

Facebook Testing Pinned Post For Groups

November 14, 2012 By: admin Category: Marketing Tips, News

By: David Cohen, November 13, 2012

Facebook is testing an expansion of the pinning feature on pages’ timelines to groups, allowing group administrators to pin multiple posts, unlike timeline for pages, where only one post can be pinned at a time.

Inside Facebook received confirmation of the test from Facebook, though it remains unclear how long a pinned post will stay atop a group’s feed, whether those posts must be manually unpinned, and whether the test will expand to mobile (it has not, so far).

Have you noticed this feature in any Facebook groups you belong to?


Pages Feed Being Tested By Facebook

November 09, 2012 By: admin Category: Marketing Tips, News

By: Sean Carson, November 9, 2012

Facebook is rolling out a “Pages Feed,” a separate news stream for users. It is slowly being rolled out to all users and will appear on the left side of your home page.

This will allow users to see all of the pages that they “Like” in a separate stream. The pages that users have marked as “Liked” will still appear in their normal newsfeed.

You can ask your fans to visit the page feed to keep up with every single one of your updates. This page feed will contain all of the pages they have “Liked,” not just your own. It could help increase reach among fans who check the page feed for updates, but again, good content is what truly drives reach.

There is a possibility that Facebook will allow you to set customized page feed lists in the future (a list of specific companies that you want to see in your page feed). It will be interesting to see what Facebook has in store for this new page feed update. There is the potential, though, that it will increase page engagement and reach.

What do you think of the Page feed update?