Online retailers and brands are inundated with vendors trying to sell them product data distribution platforms that are either self-service (basically glorified reformatting tools) or semi-serviced, meaning the vendor provides support on how to use the platform, but not on how to make the product data itself any better.

Digital marketers and e-commerce managers therefore have to learn the platform first, before they can make any meaningful improvements. Yet a channel can only really perform well when the feed uploaded to it is of high quality.

Of the thousands of feeds we see, we only rarely come across one that would not benefit massively from augmentation and a supplementary website scrape. While some of the aforementioned platforms offer additional scraping options, they scrape to make the platform work, not to make the data perform better.

The truth of the matter is that many vendors are intentionally misleading their clients: they simply do not have the technology to scrape their clients’ websites. In trying to cover up this lack of technical capability, they sell their clients short and fail to provide the support they need.

In a perfect world, every transactional website would produce a perfect data feed, with all attributes consistent and nicely formatted. Unfortunately, in our more than 11 years’ experience, this is rarely the case. Here are some of the benefits of collecting data from the consumer-facing website (a minimal scraping sketch follows the list):

  • Capturing all variants, such as colour and size
  • Capturing promotional categorisation (a standard feed will only contain the main categories)
  • Capturing visual marketing efforts (hero products, promotional bundles, seasonal bundles, etc.)
  • Capturing reviews and other social indicators, which are usually added by third-party plugins and are not available through the back-end
  • Capturing all available images, not just the standard one
  • Capturing product highlights: additional information that is not included in the description
  • Capturing all discounts, down to the product level, which some platforms do not support
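To illustrate, much of this consumer-facing data is exposed in a product page’s schema.org JSON-LD markup, which a scraper can read directly. Below is a minimal Python sketch; the URL is hypothetical, and real pages vary in how (and whether) they embed this data:

```python
import json
import urllib.request
from html.parser import HTMLParser

class JsonLdParser(HTMLParser):
    """Collects the contents of <script type="application/ld+json"> blocks."""

    def __init__(self):
        super().__init__()
        self._buffer = None
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._buffer = []

    def handle_endtag(self, tag):
        if tag == "script" and self._buffer is not None:
            self.blocks.append("".join(self._buffer))
            self._buffer = None

    def handle_data(self, data):
        if self._buffer is not None:
            self._buffer.append(data)

# Hypothetical product page; any transactional site would do.
url = "https://www.example-shop.com/products/trail-shoe-12345"
with urllib.request.urlopen(url) as response:
    html = response.read().decode("utf-8", errors="replace")

parser = JsonLdParser()
parser.feed(html)

for block in parser.blocks:
    try:
        data = json.loads(block)
    except json.JSONDecodeError:
        continue
    if isinstance(data, dict) and data.get("@type") == "Product":
        print("name:  ", data.get("name"))
        print("images:", data.get("image"))            # all images, not just the main one
        print("offers:", data.get("offers"))           # variant-level prices and discounts
        print("rating:", data.get("aggregateRating"))  # review data from third-party plugins
```

Structured markup is only one source; hero placements, promotional categories, and bundles typically have to be read from the rendered page itself.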

We have conducted hundreds of feed audits over the years, and while many retailers can produce a feed out of their system, most of those feeds at least require augmentation, and some require a full scrape, before the retailer can trade online successfully and effectively.
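In practice, augmentation means overlaying what the scraper found onto the records the shop system exports. The sketch below shows a simplified version of that merge; the field names and data shapes are purely illustrative:

```python
def augment_feed(feed_items, scraped_items):
    """Overlay scraped website data onto a system-generated feed.

    feed_items:    list of dicts keyed by 'id', as exported by the shop system
    scraped_items: dict mapping product id -> attributes found on the website
    """
    for item in feed_items:
        scraped = scraped_items.get(item["id"], {})
        # Fill the gaps the back-end export leaves open, e.g. extra images,
        # promotional categories, or review ratings.
        for field, value in scraped.items():
            if not item.get(field):
                item[field] = value
    return feed_items

# Illustrative data: the feed lacks the rating and promo data the site shows.
feed = [{"id": "12345", "title": "Trail Shoe", "image": "main.jpg", "rating": None}]
scraped = {"12345": {"rating": 4.6,
                     "extra_images": ["alt1.jpg", "alt2.jpg"],
                     "promo_category": "Summer Sale"}}
print(augment_feed(feed, scraped))
```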

An automated or supported platform will usually not ask about any of the above; it works with whatever it is given, so its algorithms operate on incomplete data that lacks some of the most important KPIs and sales drivers.

And since there is usually very little growth consultation beyond a wiki, the e-commerce manager is left alone to deal with the issues that arise.

We have a good number of customers who come to us after trying an automated platform and becoming utterly disillusioned with the ‘one size fits all’ approach and the lack of consultation.

Despite the negative propaganda spread by some of our competitors, scraping is one of the most secure and reliable methods of getting a complete and true representation of the data a consumer sees, including all visual and social marketing efforts. In fact, it is the only method of harnessing this otherwise missing data.

If you then blend in performance KPIs and maintain a regular dialogue with your data feed service provider, you will see a notable improvement in performance, search relevance, and feed quality score.
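As a sketch of what blending in performance KPIs can look like: join per-product channel statistics onto the feed and flag items that receive traffic but do not convert, since these are often the products with incomplete data rather than poor products. The thresholds and field names here are assumptions, not fixed rules:

```python
def flag_underperformers(feed_items, kpis, min_clicks=100, min_cr=0.01):
    """Flag products with traffic but a conversion rate below min_cr.

    kpis: dict mapping product id -> {'clicks': int, 'conversions': int},
    e.g. pulled from a channel's reporting export (hypothetical shape).
    """
    flagged = []
    for item in feed_items:
        stats = kpis.get(item["id"], {"clicks": 0, "conversions": 0})
        clicks, conversions = stats["clicks"], stats["conversions"]
        if clicks >= min_clicks and conversions / clicks < min_cr:
            flagged.append(item["id"])
    return flagged

feed = [{"id": "12345"}, {"id": "67890"}]
kpis = {"12345": {"clicks": 500, "conversions": 2},    # 0.4% CR -> flag for review
        "67890": {"clicks": 500, "conversions": 25}}   # 5.0% CR -> fine
print(flag_underperformers(feed, kpis))  # ['12345']
```

A flagged product then becomes a concrete talking point in that regular dialogue: is it missing variants, images, reviews, or promotional data that the website shows but the feed does not?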