Case Study

How a SaaS company powers its offering for product marketers using Contify’s News Feed APIs

Learn how a SaaS company uses Contify’s News Feed APIs to power an intelligence offering that helps product marketers aggregate intelligence on their competitors.

Business Challenge

A SaaS company that builds software for product marketers to aggregate intelligence on their competitors was getting rave reviews from its customers – businesses loved how easily they could track competitors using the tool. But behind the scenes, the company was grappling with a challenge. A core component of its offering was delivering intelligence from company websites: website copy (positioning), messaging, blog posts, press releases, whitepapers, and other marketing collateral. Its users depended on this information to refine their own messaging, plan new features, and anticipate and track changes in their competitors’ offerings.

Given the scale at which the SaaS company was adding new companies, it was becoming impossible to write scrapers for each website. Moreover, the scrapers stopped working whenever websites changed their design, and therefore had to be continuously maintained. Adding technology resources for maintenance was expensive, and outsourcing to freelancers wasn’t delivering the desired quality. The company reached out to Contify to explore whether our News APIs could help it aggregate such intelligence from company websites.

The Solution

Aggregating business-relevant information from company websites at scale is a complex undertaking. It requires a combination of intelligent systems, human intervention, and process know-how that is built on years of experience. There were several challenges that our APIs were able to address for the client:
  1. Scrapers for marketing/resources sections – The first step was to configure scrapers for all the company websites that the SaaS company was already supporting, which included writing custom scrapers for some sites. The marketing sections of corporate websites are designed differently from one another: they use cookies, JavaScript, AJAX calls, and other technologies to deliver a modern web experience. Our comprehensive web scraping framework handled most websites; wherever it failed, we extended the framework to cover the new website structure. Web scrapers also break down frequently, because modern websites change often to keep the experience fresh. Our proprietary process for monitoring the health of web pages and scrapers flags any change on a page, and from there our team gets to work to ensure that no scrapers stay broken.
  2. Maintaining scrapers and identifying new ones – It’s easy for a human to navigate a website and find new content and pages. For example, you could browse a website and quickly work out that blogs live in the resources section or that investor relations sits under the company tab. For an automated program, it’s not that simple, because the structure of every website is different. We’ve built an in-house machine-learning system that uses pattern recognition to identify relevant pages, flag new sections, and spot web pages that are no longer being updated. These are then manually reviewed by an analyst, and the relevant ones are integrated with Contify.
  3. Contextual tagging – The aggregated content has to be tagged properly so that it can be ingested at the right places in the workflow. Automated tagging works well for objective tags such as a company name or a location. But a piece of information becomes intelligence only when it carries context, and in the case of company intelligence that context comes from what the update is about: is the new blog post about inbound marketing, or is it discussing cryptocurrencies? Contify handled such contextual tagging by combining machine learning with human curation.
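To make the scraper-health idea in step 1 concrete, here is a minimal illustrative sketch (not Contify’s actual system; the function names are invented for this example). One simple approach is to fingerprint a page’s tag skeleton, so that routine copy edits don’t raise alerts but structural redesigns, which are what break scrapers, do:

```python
# Illustrative sketch of flagging scraper-breaking page changes.
# Assumption: HTML is fetched elsewhere; only change detection is shown.
import hashlib
import re


def structure_fingerprint(html: str) -> str:
    """Reduce a page to its sequence of opening tags and hash it,
    so text edits don't trigger alerts but layout changes do."""
    tags = re.findall(r"<\s*([a-zA-Z][a-zA-Z0-9]*)", html)
    return hashlib.sha256(" ".join(tags).encode()).hexdigest()


def page_changed(html: str, known_fingerprints: dict, url: str) -> bool:
    """Return True (and record the new fingerprint) when a page's
    structure no longer matches what was seen at scraper-build time."""
    fp = structure_fingerprint(html)
    if known_fingerprints.get(url) == fp:
        return False
    known_fingerprints[url] = fp  # queue the page for analyst review
    return True
```

A real pipeline would persist the fingerprints and feed flagged URLs into a review queue, but the core decision – alert on structure, ignore content churn – is the same.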
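The tagging-with-human-fallback pattern in step 3 can also be sketched. This is a deliberately simplified illustration, not Contify’s model: the topics and keywords are invented, and a weak match is routed to an analyst instead of being tagged automatically:

```python
# Illustrative sketch of contextual tagging with a human-review fallback.
# The taxonomy below is an invented example, not Contify's.
TOPIC_KEYWORDS = {
    "inbound marketing": {"seo", "content", "lead", "funnel"},
    "cryptocurrency": {"bitcoin", "blockchain", "token", "wallet"},
}


def tag_update(text: str, min_hits: int = 2):
    """Return (topic, needs_review): topic is the best keyword match,
    and low-confidence matches are flagged for manual curation."""
    words = set(text.lower().split())
    scores = {t: len(kw & words) for t, kw in TOPIC_KEYWORDS.items()}
    topic, hits = max(scores.items(), key=lambda kv: kv[1])
    return (topic if hits else None), hits < min_hits
```

In production the keyword scorer would be replaced by a trained classifier, but the split between confident automatic tags and analyst-reviewed edge cases mirrors the machine-plus-human approach described above.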



By using the Contify API, the client has been able to deliver intelligence from company websites at scale. In addition, they saved the significant costs they would have incurred had they built the technology infrastructure to aggregate company intelligence on their own. The client currently tracks thousands of company websites and is adding new ones every day.

  • 2000+

    Delivering intelligence from 2,000+ websites.