Archive for the ‘News’ Category

Twitter Launches Tailored Audiences Retargeting Product

December 06, 2013 By: admin Category: News

Twitter has just announced the launch of Tailored Audiences, their retargeting advertising product. The social network will work alongside partner companies to offer advertisers access to targeting data from sources external to Twitter.

Marketers will be able to target Twitter users based on more than just their location, gender, or other information available through the Twitter platform; they’ll have insights from search engines, site visits, CRM data and more.

In the announcement, Twitter said, “After testing this for several months, today we are announcing the global availability of tailored audiences – a new way for advertisers to define your own groups of existing and potential customers, and connect with them on Twitter with relevant messages.”

Retargeting on Social Networks

Retargeting allows advertisers deeper insight into consumers’ behavior, enabling more specific targeting and messaging. It’s a way for advertisers to get back in front of people who have expressed some interest in the brand or a related topic.

Advertisers have had access to retargeting through Google’s DoubleClick, Facebook and other networks. However, Twitter’s social offering is interesting, given the seamless placement of Promoted Tweets in the tweet stream.

In addition, most activity on Twitter happens in the stream, whereas Facebook has struggled to get users to view in-line ads in the newsfeed. Frequency can also be an issue with newsfeed ads, since users may spend just 5 percent of their Facebook time there.

The key is in understanding the level and type of intent of the user, a task made far easier with the data available through external sources.

Twitter Began Testing Retargeting in July

The move is not completely unexpected, as Twitter announced they would begin experimenting with retargeting in July 2013. At that time, senior director of product – revenue Kevin Weil wrote, “Users won’t see more ads on Twitter, but they may see better ones.”

He explained how it would work: “To get the special offer to those people who are also on Twitter, the shop may share with us a scrambled, unreadable email address (a hash) or browser-related information (a browser cookie ID). We can then match that information to accounts in order to show them a Promoted Tweet with the Valentine’s Day deal.”
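
For illustration only, here is a minimal Python sketch of the kind of hashed-email matching Weil describes. The hash function, the normalization step, and the sample data are assumptions for the example; Twitter has not published the exact mechanism.

    import hashlib

    def hash_email(email):
        # Normalize and hash an email address so it can be shared
        # without exposing the raw address (illustrative only).
        return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

    # The advertiser shares only hashes of its customer email list...
    advertiser_hashes = {hash_email(e) for e in ["jane@example.com", "sam@example.com"]}

    # ...and the platform compares them against hashes of its own account
    # emails to decide who belongs in the tailored audience.
    platform_accounts = {"@jane_doe": "jane@example.com", "@someone": "other@example.com"}
    audience = [handle for handle, email in platform_accounts.items()
                if hash_email(email) in advertiser_hashes]
    print(audience)  # ['@jane_doe']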

Tailored Audiences takes Twitter beyond scrambled, browser-related information to the type of targeting used by Facebook Ad Exchange. However, Twitter has a lot of catching up to do, especially given Facebook Exchange’s tie-in with Google’s DoubleClick.

Twitter shared early results in the blog post announcement:

“We have seen impressive results from those advertisers in our beta test using the tailored audiences program over several months’ time. Inbound marketing software platform HubSpot was an early beta tester of tailored audiences. By reaching recent visitors to their web properties with Promoted Tweets, Hubspot saw a lift in engagement rates of 45 percent with tailored audience campaigns over their historical averages. Krossover, a technology company that analyzes game video for sports coaches, used tailored audiences to drive a 74 percent decrease in cost per customer acquisition (CPA).”

Advertising Partners Key to Retargeting Data

Twitter was in talks with both Google and Microsoft in 2009 about data sharing, though the microblogging site’s relationship with Google has grown far more contentious since. Back in 2010, the two were friends with benefits, but lately it hasn’t been so. By late 2011 they’d had a falling out, and Twitter CEO Dick Costolo gave the first hint that the deal with Google was in peril (remember, this was after Google+, a social competitor, launched). Not long after, the agreement that gave Google access to the Twitter firehose ended.

Even then, there was speculation that Twitter blocked the deal in order to keep searches on its own site and maximize its advertising opportunities. Fast forward to today, and Twitter clearly sees the value in working with partners rather than relying solely on internal user data; those partners already have access to far more data than Twitter could ever gather on its own site.

Chango is one of Twitter’s Tailored Audiences partners; the firm has also been involved in FBX (Facebook Exchange) since its early days. CRO and co-founder Dax Hamman explains to SEW the benefit for advertisers: “Tailored audiences is suitable for retargeting your existing site visitors with something richer than a display ad, and for finding new individuals based on their intent.”

As to how Twitter’s Tailored Audiences differ from FBX — and why marketers would consider one or both offerings as part of their marketing plan — Hamman tells us, “Facebook exchange and tailored audiences are similar in that they are both a way to talk to individuals in those channels using the marketer’s own data. FBX is viewed typically more passively by consumers, whereas a Promoted Tweet is more proactive engagement.”

Other partners are: Adara, AdRoll, BlueKai, DataXu, Dstillery, Lotame, Quantcast, ValueClick, and [x+1].

“We are excited to be selected as one of Twitter’s launch partners, and thrilled to be working with them to help shape the future of their ad platform,” Dstillery CEO Tom Phillips said in a prepared statement. “Combining Twitter’s reach and Dstillery’s sophisticated targeting technology produces astounding results for advertisers’ campaigns.”

Twitter has moved fast to build and prove revenue since their IPO, most recently adding both custom timelines and enhanced mobile targeting in November.

(Source: searchenginewatch.com, by Miranda Miller)

Systemtek Technologies is a Naperville Web design and internet marketing firm that offers one-stop solutions to small or medium sized businesses in the Chicago metropolitan area. Our highly qualified website developers, SEO consultants, programmers and social media experts work together to deliver the results to you: reach more customers, get found anywhere and join the conversation online.

Salesforce adds social marketing power with $2.5 billion ExactTarget deal

July 11, 2013 By: admin Category: News

By Sayantani Ghosh

(Reuters) – Salesforce.com Inc, the biggest maker of online sales management tools, said it would pay $2.5 billion for marketing software maker ExactTarget, which helps companies reach customers on social networks through mobile devices.

The acquisition — the biggest ever for Salesforce.com — is its eighth in the past year and its second big purchase focused on social media. It acquired Buddy Media, which helps big brands manage Facebook and Twitter pages, in August.

ExactTarget provides internet-based marketing software used by businesses to personalize e-mail and text messages and to run social media ad campaigns. It has 6,000 customers, including Coca-Cola Co, Gap Inc and Nike Inc.

But investors took a dim view of Salesforce.com’s latest purchase, pushing down its shares as much as 5 percent in early trading.

Big technology companies such as Salesforce.com and Oracle Corp are seeking growth through acquisitions as their traditional businesses slow and smaller rivals take advantage of social media.

Considered the leader in cloud computing, Salesforce.com posted lackluster quarterly earnings last month as costs rose following its acquisition spree.

Chief Executive Marc Benioff, who called the ExactTarget deal Salesforce.com’s most important ever, said the company would now dramatically slow the pace of deal making.

“I really think what you are going to see is us taking a vacation from M&A for anywhere between probably 12 and 18 months,” he said on a conference call with analysts on Tuesday.

The acquisition will reduce Salesforce.com’s operating cash flow by $75 million to $80 million in fiscal 2014. The company said it now expected fiscal 2014 operating cash flow growth in the low teens, compared with its prior estimate of growth in the low 20 percent range.

PREMIUM OF 53 PCT

The offer price of $33.75 per share for ExactTarget represents a 53 percent premium to the stock’s closing price on Monday on the New York Stock Exchange.

The offer is 6.6 times ExactTarget’s expected revenue for 2013, according to Thomson Reuters I/B/E/S.

ExactTarget shares, which prior to Tuesday’s offer had risen 16 percent since they listed on the New York Stock Exchange in March 2012, rose 53 percent in early trading to $33.73 on Tuesday.

Salesforce.com said its revenue for fiscal year ending January 2014 would increase by $120 million to $125 million as a result of the ExactTarget acquisition, which is expected to close in the quarter ending July 31.

The acquisition is expected to reduce fiscal 2014 earnings by about 16 cents per share, Salesforce.com said.

Salesforce.com, which had about $3.1 billion in cash and marketable securities at the end of the first quarter, said it would finance the transaction through cash on hand and a term loan.

Salesforce.com shares were down 3 percent at $39.77 just after the opening.

BofA Merrill Lynch was the financial adviser to Salesforce.com and JP Morgan advised ExactTarget.

(Additional reporting by Chandni Doulatramani in Bangalore; Editing by Saumyadeb Chakrabarty)

http://www.theusdaily.com/articles/viewarticle.jsp?id=2636713

Google Penguin, the Second (Major) Coming: How to Prepare

April 10, 2013 By: admin Category: News

Simon Penson

Unless you’ve had your head under a rock, you’ve undoubtedly heard the rumblings of a coming Google Penguin update of significant proportions.

To paraphrase Google’s web spam lead Matt Cutts, the algorithm filter has “iterated” to date, but there is a “next generation” coming that will have a major impact on SERPs.

Having watched the initial rollout take many by surprise, it makes sense this time to at least attempt to prepare for what may be lurking around the corner.

Google Penguin: What We Know So Far

We know that Penguin is purely a link quality filter that sits on top of the core algorithm, runs sporadically (the last official update was in October 2012), and is designed to take out sites that use manipulative techniques to improve search visibility.

And while there have been many examples of this being badly executed, with lots of site owners and SEO professionals complaining of injustice, it is clear that web spam engineers have collected a lot of information over recent months and have improved results in many verticals.

That means Google’s team is now on top of the existing data pile and its testing output, and as a result it is hungry to make a major structural change to the way the filter works once again.

We also know that months of manual resubmissions and disavows have helped the Silicon Valley giant collect an unprecedented amount of data about the “bad neighborhoods” of links that, until very recently, had powered rankings for thousands of high-profile sites.

They have even been involved in specific, high-profile web spam actions against sites like Interflora, working closely with internal teams to understand where links came from and watching closely as they were removed.

In short, Google’s new data pot makes most big data projects look like a school register! All the signs therefore point towards something much more intelligent and all encompassing.

The question is: how can you profile your links and understand the probability of being impacted when Penguin hits within the next few weeks or months?

Let’s look at several evidence-based theories.

The Link Graph – Bad Neighborhoods

Google knows a lot about what bad links look like now. They know where a lot of them live and they also understand their DNA.

And once they start looking it becomes pretty easy to spot the links muddying the waters.

The link graph is a kind of network graph and is made up of a series of “nodes” or clusters. Clusters form around IPs and as a result it becomes relatively easy to start to build a picture of ownership, or association.

Google assigns weight or authority to links using its own PageRank currency, but like any currency it is limited, which means we all have to work hard to earn it from sites that have, over time, built up enough to go around.

This means that almost all sites that use “manipulative” authority to rank higher will be getting it from an area or areas of the link graph associated with other sites doing the same. PageRank isn’t limitless.
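
A minimal power-iteration sketch in Python illustrates why PageRank behaves like a finite currency; the toy graph, damping factor, and iteration count below are invented for the example and are not Google’s actual implementation.

    # Minimal PageRank power iteration over a toy link graph. The scores
    # always sum to roughly 1.0: authority flowing into one cluster of
    # pages is authority that is not flowing anywhere else.
    links = {
        "a": ["b", "c"],   # page "a" links to "b" and "c"
        "b": ["c"],
        "c": ["a"],
        "spam": ["a"],     # a low-value page pointing at "a"
    }
    damping = 0.85
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(50):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank

    print({p: round(rank[p], 3) for p in pages})
    print(round(sum(rank.values()), 3))  # ~1.0: the "currency" is conserved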

These “bad neighborhoods” can be “extracted” by Google, analyzed, and dumped relatively easily.

They won’t disappear, but Google will devalue them and remove them from the PageRank picture, rendering them useless.

Expect this process to accelerate now that the search giant has so much data on “spammy links”, with swathes of link profiles getting knocked out overnight.

The concern of course is that there will be collateral damage, but with any currency rebalancing, which is really what this process is, there will be winners and losers.

Link Velocity

Another area of interest at present is the rate at which sites acquire links. In recent months there has definitely been a noticeable change in how new links are being treated. While this is very much theory, my view is that Google has become very good at spotting link velocity “spikes”, and anything out of the ordinary is immediately devalued.

Whether the devaluation is indefinite or limited by time (in the same way the “sandbox” works), I am not sure, but there are definite correlations between sites that earn links consistently and good ranking increases. Those that earn a lot of links quickly do not see the same relative effect.

And it would be relatively straightforward to build this into the Penguin model, if it isn’t there already. Picture a “bumpy” link acquisition profile: anything above the “normalized” baseline could be devalued.
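
To make the idea concrete, here is a small Python sketch of spike detection against a normalized baseline; the rolling-average window and the spike threshold are arbitrary assumptions for illustration, not anything Google has published.

    def find_link_spikes(weekly_new_links, window=4, threshold=2.0):
        # Flag weeks where new-link counts jump well above the recent
        # rolling average, i.e. the kind of "bumpy" profile described above.
        spikes = []
        for i, count in enumerate(weekly_new_links):
            history = weekly_new_links[max(0, i - window):i]
            baseline = sum(history) / len(history) if history else count
            if baseline and count > threshold * baseline:
                spikes.append((i, count, round(baseline, 1)))
        return spikes

    # Steady link earning, then a sudden burst in week 6.
    weeks = [12, 15, 14, 13, 16, 14, 95, 18, 15]
    print(find_link_spikes(weeks))  # week 6 (95 new links) flagged as a spike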

Link Trust

The “trust” of a link is also something of interest to Google. Quality (how much juice the link carries) is one thing, but trust is another thing entirely.

Majestic SEO has captured this reality best with the launch of its Citation Flow and Trust Flow metrics, which help identify untrusted links.

How is trust measured? In simple terms it is about good and bad neighborhoods again.

In my view, Google uses its Hilltop algorithm, which identifies so-called “expert documents” (websites) across the web that are seen as shining beacons of trust and delight. The closer your site is to those documents, the better the neighborhood. It’s a little like living on the “right” road.

If your link profile contains a good proportion of links from trusted sites then that will act as a “shield” from future updates and allow some slack for other links that are less trustworthy.
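
As an illustration of the “distance from trusted documents” idea, here is a minimal Python sketch that measures how many link hops separate a site from a set of trusted seed sites. The seed list, the toy graph, and the hop-based scoring are assumptions for the example, not a description of Google’s actual trust computation.

    from collections import deque

    def hops_from_trusted(link_graph, trusted_seeds):
        # Breadth-first search outward from trusted "expert" sites, recording
        # how many link hops each site is from the nearest seed.
        distance = {seed: 0 for seed in trusted_seeds}
        queue = deque(trusted_seeds)
        while queue:
            site = queue.popleft()
            for linked_site in link_graph.get(site, []):
                if linked_site not in distance:
                    distance[linked_site] = distance[site] + 1
                    queue.append(linked_site)
        return distance

    # Toy graph: edges point from a site to the sites it links to.
    graph = {
        "university.edu": ["industry-blog.com"],
        "industry-blog.com": ["your-site.com"],
        "your-site.com": ["partner.com"],
        "link-farm.biz": ["your-site.com"],
    }
    print(hops_from_trusted(graph, ["university.edu"]))
    # {'university.edu': 0, 'industry-blog.com': 1, 'your-site.com': 2, 'partner.com': 3}
    # link-farm.biz is never reached from a trusted seed, i.e. a worse "neighborhood".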

Social Signals

Many SEO pros believe that social signals will play a more significant role in the next iteration of Penguin.

While social authority, as it is becoming known, makes a lot of sense in some markets, it also has limitations. Many verticals see little to no social interaction, and without big pots of social data a system that qualifies link quality by the number of social shares across a site or piece of content can’t work effectively.

In the digital marketing industry it would work like a dream, but for others it is a non-starter, for now. Google+ is Google’s attempt to fill that void: by pushing as many people as possible to work logged in, Google gets everyone closer to Plus and closer to handing over that missing data.

In principle, though, it is possible that social sharing and other signals may be used in a small way to qualify link quality.

Anchor Text

Most SEO professionals will point to anchor text as the key telltale metric when it comes to identifying spammy link profiles. The first Penguin rollout would undoubtedly have used this data to begin drilling down into link quality.

In researching this post, I asked a few prominent SEO professionals what they considered the key indicator of spam, and almost all pointed to anchor text.

“When I look for spam the first place I look is around exact match anchor text from websites with a DA (domain authority) of 30 or less,” said Distilled’s John Doherty. “That’s where most of it is hiding.”

His thoughts were backed up by Zazzle’s own head of search Adam Mason.

“Undoubtedly low value websites linking back with commercial anchors will be under scrutiny and I also always look closely at link trust,” Mason said.

The key is the relationship between branded and non-branded anchor text. Any natural profile would be heavily led by branded anchors (e.g., the brand name or www.example.com) and “white noise” anchors (e.g., “click here”, “website”, etc.).

The allowable percentage is tightening. A recent study by Portent found that the percentage of “allowable” spammy links has been reducing for months now, standing at around 80 percent pre-Penguin and 50 percent by the end of last year. The same is true of exact match anchor text ratios.

Expect this to tighten even more as Google’s understanding of what natural “looks like” improves.
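
For illustration, here is a short Python sketch of the kind of branded vs. commercial anchor-text profiling the SEOs quoted above describe; the anchor categories, brand terms, and sample data are made up for the example, and the resulting ratios are not an official threshold.

    def anchor_text_profile(anchors, brand_terms,
                            white_noise=("click here", "website", "here", "read more")):
        # Classify anchor texts as branded, white-noise, or commercial
        # (exact/phrase match) and report the share of each.
        counts = {"branded": 0, "white_noise": 0, "commercial": 0}
        for anchor in anchors:
            text = anchor.lower().strip()
            if any(term in text for term in brand_terms):
                counts["branded"] += 1
            elif text in white_noise:
                counts["white_noise"] += 1
            else:
                counts["commercial"] += 1
        total = sum(counts.values())
        return {k: round(v / total, 2) for k, v in counts.items()}

    anchors = ["Example Widgets", "www.example.com", "click here",
               "buy cheap widgets", "best widgets online", "website"]
    print(anchor_text_profile(anchors, brand_terms=("example",)))
    # {'branded': 0.33, 'white_noise': 0.33, 'commercial': 0.33}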

Relevancy

One area that will certainly be under the microscope as Google looks to improve its semantic understanding is relevancy. As it builds up a picture of relevant associations that data can be used to assign more weight to relevant links. Penguin will certainly be targeting links with no relevance in future.

Traffic Metrics

While traffic metrics probably fall more under Panda than Penguin, the lines between the two are increasingly blurring to a point where the two will shortly become indistinguishable. Panda has already been subsumed into the core algorithm and Penguin will follow.

On that basis Google could well look at traffic metrics such as visits from links and the quality of those visits based on user data.

Takeaways

No one is in a position to accurately predict what the next coming will look like, but what we can be certain of is that Google will turn the knife a little more, making link building in its former sense a riskier tactic than ever. As numerous posts have pointed out in recent months, it is now about earning those links by contributing and adding value via content.

If I were asked where my money was, I would say we will see the allowable level of spam tightened still further, along with some attempt to measure link authority by the neighborhood it comes from and any associated social signals. The rate at which links are earned will also come under more scrutiny, and that means you should think about:

Understanding your link profile in much greater detail. Tools and data from companies such as Majestic, Ahrefs, CognitiveSEO, and others will become more necessary to mitigate risk.
Where your link comes from, not just what level of apparent “quality” it has. Link trust is now a key metric.
Increasing the use of brand and “white noise” anchor text to remove obvious exact and phrase match anchor text problems.
Looking for sites that receive a lot of social sharing relative to your niche and building those relationships.
Running backlink checks on the sites you get links from to ensure their equity isn’t coming from bad neighborhoods, as that could pass to you.

This post originally appeared on searchenginewatch.com

Google Running Feedback Experiment Similar to Human Quality Rater Test

December 19, 2012 By: admin Category: News

By: Matt McGee, December 19, 2012

Google has long asked searchers to provide feedback on the quality of its search results, and often runs a number of tests aimed at encouraging such feedback. The latest such experiment, which seems to have been live for at least a month or so, is a bit different because it asks searchers for feedback only on certain results.

Eli Schwartz recently shared with us a screenshot after doing a search for [product synonym]. As you can see below, Google is using the open space on the right of the results page to ask “Which result do you prefer?”

What’s most interesting is that Google isn’t asking for general feedback on the search results page as a whole; it’s specifically pulling out two of the results — and not the top two. In this case, Google is specifying the third and fourth links and asking the searcher to “visit both pages before choosing.”

That’s similar to the “side-by-side” tasks that Google’s army of human search quality raters often perform — a type of task that you can learn more about in these articles:

In that second article, you’ll find a screenshot of a “basic” side-by-side task that also asks the evaluator to view two specific pages and choose the better result.

The Street recently saw this same feedback form, and pointed out that the “Learn more” link at the bottom leads to a Google help page that says user feedback “will not directly influence the ranking of any single page,” and refers to how the data is used in conjunction with its “professional search evaluators.”

In a typical year, we experiment with tens of thousands of possible changes. These changes, whether minor or major, are tested extensively by professional search evaluators, who look at results and give us feedback, and “live traffic experiments” where we turn on a change for a portion of users. Testing helps us whittle down our list of changes to about 500 improvements per year.

Earlier this summer, Google was running a similar, but less specific, feedback test that asked searchers “How Satisfied Are You With These Results?” That survey asked for feedback on the full search results page.

Bing Adds People and Places to Snapshot

December 17, 2012 By: admin Category: News

By: Allison Howen, December 14, 2012

Bing’s Snapshot, which displays information such as maps, movie times and restaurant menus directly within search result pages, now includes two new categories – people and landmarks.

Snapshot is the middle column between Bing’s main search results and the social sidebar that resides on the right side of the results page. The feature, which launched in June, displays relevant information related to search queries in order to make it easier for users to take actions, such as booking a hotel room, directly from the results page. However, according to Bing’s blog, the company has run thousands of experiments to determine which topics are most frequently searched for since the launch of Bing’s three-column redesign, and the data led Bing to add people and places to Snapshot’s functionality.

Now, upon a search query for a famous person, celebrity, or place, Bing displays relevant facts about the topic to make it easier for users to find exactly what they are looking for before clicking through to another website. Furthermore, Snapshot includes content such as reviews, movie trailers and links that make it easy for users to discover information at a glance or take actions, such as listening to or purchasing music, with just one click.

It is also important to note that this isn’t the last update for Snapshot, because according to Bing’s blog, the company will be adding more categories in the coming weeks.

This post originally appeared on websitemagazine.com


Happy Holidays From the Systemtek Team

December 17, 2012 By: admin Category: News

Dear Reader,

All of us at Systemtek Technologies would like to take a moment to express our gratitude to our loyal customers, friends, and family.

As the Holiday Season is upon us, we find ourselves reflecting on the success of the past year. We hope that next year will be successful for you as well.

We wish you a very happy Holiday Season and a New Year filled with peace and prosperity.

Sincerely,

John Li
President/CEO
630-701-6163
johnli@systemtek.net
www.systemtek.net

6 AdWords Enhancements Offer More Insights, Efficiency

December 04, 2012 By: admin Category: Marketing Tips, News

By: Jason Tabeling, December 4, 2012

In paid search marketing, the goal at all times is to drive better results. You spend your time poring over data looking for that one data point that will help you achieve those better results.

In order to focus your time on finding that data point you need two things:

  • Tools that drive efficiency so you can spend more time looking for that insight.
  • Access to incremental insights that let you see new data points and spot better opportunities to optimize.

Google has recently spent some time upgrading AdWords to provide tools that help in both the efficiency and insights categories. Here are a few that you might want to take advantage of.

Insights

  • Campaign Diagnostics: Now AdWords will proactively alert you if you’re about to make a change to your account that might have a negative effect; for example, adding a negative keyword that would inadvertently block another keyword. This catches the kind of issue that is easy to miss when you’re trying to do the right thing for an account but it doesn’t turn out that way.
  • Impression Share for Search & Display: Have you been wondering what percentage of display inventory you’ve been buying? Now AdWords will break out performance individually across both search and display. This is very helpful for the display side; however, you still shouldn’t run display and search ads out of the same campaign.

  • Email-only alerts with automated rules: Not fully comfortable with the system making changes without your final approval? This option sends you a proactive email about the actions your automated rules would take, rather than applying them automatically.

Efficiency

  • Keyword columns in search terms report: For a while now, AdWords’ search terms report has shown which search queries triggered your keyword’s ads. Now you will also be able to quickly add those search terms as keywords from the same screen.

  • Faster editing across your account: AdWords is bringing a feature that has always been available in the Editor to the web interface – the ability to make bulk changes quickly. Being able to raise, lower, or otherwise change bids in bulk is a great way to save time. Just be sure you’re thinking about the implications of big, generic bid changes.
  • AdWords scripts: The ability to write JavaScript code to automate your account – for example, updating ads with information from a pricing database or adjusting bids. A helpful tool for efficiency and flexibility; a rough sketch of this kind of rule-based logic follows below.
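
The real AdWords Scripts environment is JavaScript running against Google’s own API, so the snippet below is only a language-agnostic sketch (written in Python to match the other examples in this archive) of the kind of rule-based bid logic such a script might automate; the performance data, target CPA, thresholds, and bid caps are all invented for illustration.

    def suggest_bid_changes(keywords, target_cpa=25.0, max_bid=5.0):
        # Nudge bids up on keywords converting below the target CPA and
        # down on keywords converting above it (illustrative rules only).
        suggestions = {}
        for kw in keywords:
            if kw["conversions"] == 0:
                continue  # not enough data to judge
            cpa = kw["cost"] / kw["conversions"]
            if cpa < target_cpa:
                new_bid = min(kw["max_cpc"] * 1.10, max_bid)  # +10%, capped
            else:
                new_bid = kw["max_cpc"] * 0.90                # -10%
            suggestions[kw["text"]] = round(new_bid, 2)
        return suggestions

    keywords = [
        {"text": "blue widgets", "cost": 120.0, "conversions": 8, "max_cpc": 1.50},
        {"text": "cheap widgets", "cost": 200.0, "conversions": 4, "max_cpc": 2.00},
        {"text": "widget repair", "cost": 40.0, "conversions": 0, "max_cpc": 0.75},
    ]
    print(suggest_bid_changes(keywords))
    # {'blue widgets': 1.65, 'cheap widgets': 1.8}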

Overall, Google is doing a lot to help improve the system, and providing tools to help paid search marketers focus on driving improved results. Look for more to come based on feedback from the masses, and keep looking for those insights.

This post originally appeared on searchenginewatch.com

Facebook Launches Conversion Measurement Tool

November 20, 2012 By: admin Category: Marketing Tips, News

By: Miranda Miller, November 20, 2012

Facebook began rolling out a conversion measurement tool on Friday to help marketers bridge the data gap between social ads and online sales.

David Baser, Facebook’s ads product manager, told Reuters the tool has been a highly requested feature for some time. “Measuring ad effectiveness and outcomes is absolutely crucial to all types of businesses and marketers,” he said.

“You would see the number of people who bought shoes,” he said, using the example of an online shoe retailer. However, marketers couldn’t get information that could identify the people, he added.

Third parties such as social shopping app maker Glimpse have been offering solutions to specific aspects of the social commerce “problem” for some time, particularly the disparate data sets available to online retailers.

“There are about 500 million products for sale and about 100 million (1 in 5) have a Like button next to them. Only about 3 million of those have ever had a Like,” Usher Liebermann of Glimpse told Search Engine Watch. “We see a lot of value in the data of knowing what someone has Liked. Once they give permission, we can see the products they Like, the stores and brands they Like, and the stores, brands and products their friends like.”

Glimpse uses their crawl data with Facebook’s Open Graph to gain more insight than is available from Facebook itself. “Say you like a shoe on Nordstrom’s site. Facebook knows you Like a product, but they don’t know it’s a shoe; only that it is a product and the page it came from,” Liebermann explained. “Nordstrom’s know something has been Liked. We have the contextual meta data around it… we actually know more about it than Facebook.” This allows Glimpse to analyze activity surrounding specific products and make social recommendations to users of the app.

Until now, however, retailers have struggled to directly attribute online purchases to Facebook ads activity. The new feature is as critical for Facebook as it is for retailers; they need to prove to advertisers that their ads are a worthy investment and to do that, measurable outcomes are key.

Still, advertisers saw improved ad performance in Q3 2012, with click-through rates more than making up for falling costs per click.

Social media revenue will reach $34 billion by 2016, according to Gartner analysts. Advertising and gaming account for approximately 90 percent of that revenue; in 2012, they accounted for $8.8 billion and $6.2 billion, respectively.

“New revenue opportunities will exist in social media, but no new services will be able to bring significant fresh revenue to social media by 2016,” said Neha Gupta, senior research analyst at Gartner. “The biggest impact of growth in social media is on the advertisers. In the short and medium terms, social media sites should deploy data analytic techniques that interrogate social networks to give marketers a more accurate picture of trends about consumers’ needs and preferences on a customized basis.”

The ability to attribute online sales to Facebook ads is certainly a big step in the right direction for advertisers. Facebook has not yet announced when the tool will be available to all marketers.

This post originally appeared on searchenginewatch.com


Happy Thanksgiving!

November 19, 2012 By: admin Category: News

Thanksgiving is a family holiday full of love, sharing and giving thanks. It’s one day that you can forget about your job and responsibilities. The Systemtek team has a lot to be thankful for as well. We’re especially thankful for you, our customers. We hope that you can spend this special day with your family and friends in an atmosphere full of love and thanks.

The Systemtek team hopes you have a great Thanksgiving and may your business be full of customers.

Happy Thanksgiving!

Sincerely,

The Systemtek Team

Facebook Testing Pinned Post For Groups

November 14, 2012 By: admin Category: Marketing Tips, News

By: David Cohen, November 13, 2012

Facebook is testing an expansion of the pinning feature from pages’ timelines to groups. Unlike timeline for pages, where only one post can be pinned at a time, group administrators will be able to pin multiple posts.

Inside Facebook received confirmation of the test from Facebook, but noted that it was unclear how long a pinned post would remain atop a group’s feed, whether those posts had to be manually unpinned, and whether the test would expand to mobile (it has not, so far).


Have you noticed this feature in any Facebook groups you belong to?

(source: allfacebook.com)