Advanced Tech SEO for News Publishers with Barry Adams - Webinar

Topic: Advanced Tech SEO for News Publishers

Host: John Shehata
Panel: Barry Adams

Scope:

  • Core Web Vitals for News Publishers
  • SEO for Paywalled Content
  • International SEO for News Websites
  • Technical SEO Audits for News Sites
  • Migrations
  • New/Advanced implementation of Structured Data
  • Technical SEO for Timeliness & Freshness
  • SGE impact
  • AI in the newsroom

 

Webinar Summary

A Deep Dive into News Tech SEO with Barry Adams
 

John Shehata: Welcome, everyone, to this exciting episode of NewzDash webinars. Today, we're thrilled to have Barry Adams with us, not only a dear friend but also a beacon of brilliance in the SEO domain, particularly in the nuanced field of News SEO. Barry's journey in SEO precedes even the term itself, tracing back to the late '90s. Besides his pioneering work, Barry is the driving force behind the SEO for Google News newsletter and the co-founder of NESS, the News and Editorial SEO Summit. Barry, it's an honor to have you join us today.

Barry Adams: Thank you for such a kind introduction, John. It's a bit overwhelming, to be honest. I'm here with the hope of living up to everyone's expectations and excited to dive into the realm of News SEO.

John Shehata: The anticipation is palpable, and personally, this session holds a special place. It's an opportunity to pose to Barry every question about News SEO that's been on my mind. We've structured today's webinar to first explore these questions, followed by a brief interlude with a presentation from NewzDash. The latter half will be dedicated to your questions, so I encourage you to use the Q&A feature on the right side panel for any queries specifically for Barry.

 

What do you think distinguishes news SEO from traditional SEO practices, and why is it important right now? For the longest time I have been trying to establish news SEO as a branch of SEO, and I think the industry has matured enough that we now see a lot of news publishers hiring news SEOs or SEO editors. What, in your opinion, distinguishes it from regular SEO?

Barry Adams: News SEO stands out primarily because of the speed requirement. Google needs to quickly crawl, index, and rank news content to keep up with the fast pace of news updates. This urgency forces Google to take shortcuts, especially in crawling and indexing processes, to ensure timely ranking of news articles.

John Shehata: It's interesting how this urgency in the news cycle creates a unique SEO landscape where immediate results are not just desired but necessary. This immediacy, I believe, makes News SEO particularly satisfying.

Barry Adams: A significant distinction between News SEO and traditional SEO lies in the reliance on authority and trust signals. The news domain within Google's search ecosystem demands a higher standard of quality due to the potential impact and visibility of news content. This necessitates a stronger emphasis on establishing a website's credibility and authority to ensure visibility and ranking in news-related search results.

Real-Time SEO and Its Implications

Barry Adams: News SEO can be seen as the closest thing to real-time SEO. The window for an article to get indexed and drive search traffic is incredibly short, often within a few minutes of publication. This dynamic makes the field both challenging and rewarding, as successful optimization can lead to immediate results.

John Shehata: Absolutely, the immediacy with which results can be seen in News SEO is unmatched. Unlike traditional SEO, where the impact of optimizations might take months to become evident, News SEO offers a gratifying experience of witnessing the immediate effects of your efforts.

 

 

So over these two decades or more of SEO and news SEO, what do you think has changed, especially over the last few years?

Barry Adams: Highlighting a pivotal moment in the evolution of News SEO, Barry points out the inception of Google News as a distinct entity with its own crawling and indexing mechanisms. This was a response to the slower processes of Google's primary system, which couldn't accommodate the fast-paced nature of news content. Initially, Google News operated with a manually curated index, ensuring that only verified publishers were included, a practice that underscored the importance of speed and reliability in news dissemination.

As technology progressed, Google's regular crawling and indexing systems improved in speed and efficiency, leading to a significant shift around 2018-2019. Google News was integrated into the main search infrastructure, subjecting news publishers to the same algorithmic criteria as all other content types. This merger marked a new era where News SEO became intrinsically linked to the broader landscape of SEO practices.

Barry Adams: With the integration, News SEO has faced increased volatility. The reliance on the same ranking signals as traditional SEO means news sites are directly impacted by Google's broader algorithm updates. This change has intensified the competition and eliminated the "free ride" that news publishers previously enjoyed in search results. The new paradigm demands that news publishers, regardless of their size or prestige, must now work harder to earn their place in Google's search results, emphasizing the importance of trust, authority, and compliance with Google's evolving guidelines.

 

 

Does Scale Provide an Unfair Advantage in News SEO?

Barry Adams: Absolutely, scale does play a significant role in the SEO ecosystem, and this isn't accidental. For Google, the reliability and authority of a news site are crucial. Large sites like the BBC not only have the advantage of a strong brand that users trust and engage with, thereby sending positive signals to Google, but they also invest heavily in quality journalism. This investment helps them accumulate the authority signals that Google values.

Large news publishers, therefore, have a distinct advantage in terms of visibility and ranking on Google's platform. Their resources allow for broader coverage and deeper engagement, which in turn reinforces their authority and trustworthiness in Google's eyes.

This reality underscores a broader principle across industries: more significant resources often lead to better visibility and higher rankings. While this might seem to skew the playing field, it's a reflection of the emphasis on authority and trust that Google places on content, especially in the news domain.

However, this dynamic also brings into focus discussions about market fairness and the role of antitrust laws in ensuring a level playing field. In the context of SEO and digital visibility, the dominance of large players raises important questions about access, competition, and the future of news dissemination in the digital age.

 

 

The transition from manual to automated inclusion in Google News has been a source of frustration for many new sites. The lack of transparency in the review process leaves publishers in the dark about their status. Given your extensive experience, Barry, do you have any recommendations for these sites?

Barry Adams: The move to an automated system has indeed made the inclusion process more opaque, significantly impacting new sites' ability to enter the Google News ecosystem. The key lies in building authoritative signals over time, which Google recognizes and values. For new sites, this process can be slow, often taking years to establish the necessary level of credibility and trustworthiness in Google's eyes.

To navigate this challenging landscape, publishers must focus on generating high-quality content that fosters engagement and shares, contributing to the site's overall authority. Transparency about the publisher's identity and mission, along with robust editorial standards, can also play a crucial role in building trust with both users and Google.

Barry Adams: While the pathway to visibility in Google News and Top Stories may be daunting, Google Discover offers a more immediate opportunity for traffic. By producing engaging content that resonates with readers, sites can gain traction in Discover, which, while distinct from Google News, can significantly contribute to a site's visibility and audience growth.

John Shehata: Reflecting on the challenges faced by news publishers, especially in terms of monetization and the temptation to increase ad load, it's clear that a long-term vision focused on user experience and engagement is crucial. The emphasis on clicks and engagement not only supports visibility in Google's various news-related platforms but also underscores the importance of quality content and a user-friendly site design.

 

 

What are the key elements you now prioritize in your audits to ensure a site is well-optimized for today’s standards?

Barry Adams: It's quite revealing to look back at audits from a few years ago and see that the core elements of technical SEO have remained largely consistent. The focus remains on optimizing for efficient crawling and indexing by Google. This involves ensuring that new articles can be quickly discovered and indexed, minimizing crawl overhead, and presenting clean HTML for Google to process.

The fundamental principles have not shifted dramatically. However, the approach to implementing these principles has become more refined. Proper use of header tags, clean HTML structure, and reducing obstacles for Google's parsers are as crucial as ever. Moreover, features like the max-image-preview robots meta tag for Google Discover and more focused NewsArticle structured data have become important for enhancing visibility.

Additionally, the significance of Expertise, Authoritativeness, and Trustworthiness (EAT) in the context of technical SEO has grown. Elements like linking author pages from bylines and ensuring your structured data accurately reflects your content are now considered best practices, serving as indirect signals of a site's credibility and relevance.
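To make those last two points concrete, here is a minimal sketch of an article-template head combining a NewsArticle JSON-LD block (with the author property linking to the same author page the byline points at) and the robots meta tag that opts Discover into large image previews. The URLs, names, and dates are placeholders, not values discussed in the webinar.

```python
import json

# Minimal sketch: NewsArticle JSON-LD with an author page linked from the byline,
# plus the robots meta tag that opts Discover into large image previews.
# All URLs and names are placeholders.
article_schema = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline kept concise",
    "datePublished": "2024-01-15T08:30:00+00:00",
    "dateModified": "2024-01-15T09:10:00+00:00",
    "image": ["https://example-news.com/images/lead-image.jpg"],
    "author": [{
        "@type": "Person",
        "name": "Jane Reporter",
        "url": "https://example-news.com/authors/jane-reporter",  # same page the byline links to
    }],
    "publisher": {
        "@type": "Organization",
        "name": "Example News",
        "logo": {"@type": "ImageObject", "url": "https://example-news.com/logo.png"},
    },
}

head_snippet = (
    '<meta name="robots" content="max-image-preview:large">\n'
    f'<script type="application/ld+json">{json.dumps(article_schema, indent=2)}</script>'
)
print(head_snippet)
```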

 

 

What are the most frequent issues you encounter, and what recommendations do you consistently find yourself making?

Barry Adams: Reflecting on numerous audits, it's evident that certain challenges persist within the technical SEO landscape, particularly for news websites. A significant portion of my recommendations revolves around three main areas: load speed, internal linking practices, and pagination.

The issue of load speed and server response time is a recurring theme, affecting a majority of the sites audited. Optimizing for a swift server response is crucial, as Google prioritizes sites that can be crawled quickly and efficiently. For news publishers, especially those located geographically distant from Google's primary crawling locations, employing a Content Delivery Network (CDN) can provide a competitive edge by reducing response times significantly.
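As a quick way to sanity-check that first point, here is a minimal sketch that samples time-to-first-byte for a handful of URLs. The elapsed timer in requests (request sent to response headers parsed) is only a rough proxy for what Googlebot experiences, and the URLs are placeholders.

```python
import statistics
import requests

# Minimal sketch: measure rough server response time (time to first byte) for a few
# article URLs. response.elapsed covers the span from sending the request to parsing
# the response headers. URLs below are placeholders.
urls = [
    "https://example-news.com/",
    "https://example-news.com/politics/some-article",
    "https://example-news.com/sport/another-article",
]

for url in urls:
    timings = []
    for _ in range(3):  # a few samples to smooth out noise
        response = requests.get(url, timeout=10)
        timings.append(response.elapsed.total_seconds() * 1000)
    print(f"{url}: median TTFB ~ {statistics.median(timings):.0f} ms")
```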

Another common mistake involves the use of tracking parameters on internal links. This practice can lead to an inflated number of URLs for Google to crawl, potentially confusing Google's understanding of canonical URLs and skewing analytics data. The recommendation is to avoid these parameters or to use cleaner methods like hash fragments, which Google ignores, preserving the integrity of internal link structure and analytics accuracy.
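A minimal sketch of how that check might be automated: fetch a page, walk its internal links, and flag any that carry tracking parameters. The domain and the parameter list are assumptions to adjust to your own analytics setup.

```python
from urllib.parse import urlparse, parse_qs

import requests
from bs4 import BeautifulSoup

# Minimal sketch: flag internal links that carry tracking parameters, which create
# extra URL variants for Google to crawl. SITE and TRACKING_PARAMS are assumptions.
SITE = "example-news.com"
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref", "src"}

html = requests.get(f"https://{SITE}/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", href=True):
    parsed = urlparse(a["href"])
    if parsed.netloc and SITE not in parsed.netloc:
        continue  # external link, ignore
    flagged = TRACKING_PARAMS & set(parse_qs(parsed.query))
    if flagged:
        print(f"{a['href']} -> tracking parameters: {', '.join(sorted(flagged))}")
```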

Pagination practices often fall short of optimal, with implementations that hinder Google's ability to access content beyond the first page. Whether through endless scroll functionalities or poorly designed "read more" buttons, these approaches can obscure significant portions of content from Google's view. The advice here centers on finding a balance that ensures Google can access a sufficient volume of content while avoiding overwhelming the crawler with excessive pagination depths.
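And a minimal sketch for the pagination point: fetch a section's paginated pages as raw HTML (no JavaScript executed, roughly what Google parses first) and count how many article links each page exposes that earlier pages did not. The ?page=N URL pattern is an assumption; many CMSs use a different scheme.

```python
import requests
from bs4 import BeautifulSoup

# Minimal sketch: confirm paginated section pages expose article links in the raw HTML.
# The section URL, link pattern, and ?page=N parameter are assumptions.
SECTION = "https://example-news.com/politics"
MAX_DEPTH = 5  # enough depth to surface recent archives without excessive crawling

seen = set()
for page in range(1, MAX_DEPTH + 1):
    url = SECTION if page == 1 else f"{SECTION}?page={page}"
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    links = {a["href"] for a in soup.select("a[href*='/politics/']")}
    new_links = links - seen
    print(f"page {page}: {len(new_links)} article links not seen on earlier pages")
    seen |= links
```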

 

 

With the plethora of terms like speed, server response, and Core Web Vitals causing confusion among SEOs, what's your take on what truly matters for a website's performance, especially given the hype around Core Web Vitals in recent years?

Barry Adams: Emphasizing the central role of server response time, Barry identifies it as the most crucial factor for website performance in the context of SEO. This element directly impacts Google's ability to crawl a site efficiently, affecting the site's visibility and ranking. Unlike the debated significance of Core Web Vitals, server response time has a measurable, tangible impact on a website's performance from a technical SEO perspective.

Despite the industry's recent focus on Core Web Vitals as a pivotal metric, Adams expresses skepticism regarding their direct influence as a ranking factor. Instead, he advocates for their utility as indicators of user experience, offering a quantifiable method for assessing and improving how users interact with a website. While not dismissing their relevance, Adams suggests a more measured approach, viewing Core Web Vitals as part of a broader strategy to enhance site performance rather than the sole priority.

Echoing Shehata's observations, Adams agrees that an overemphasis on Core Web Vitals, to the detriment of content quality and site architecture, is misguided. Both experts advocate for a balanced SEO strategy that considers various factors, including server response time and user experience, as measured by Core Web Vitals, without losing sight of the fundamental aspects of content and structural integrity.
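For teams that want to keep Core Web Vitals in that supporting role, field data can be monitored without much ceremony. Below is a minimal sketch using the public PageSpeed Insights API; the URL is a placeholder and the metric key names should be verified against the actual v5 response.

```python
import requests

# Minimal sketch: pull Chrome UX Report field data for a URL via the public
# PageSpeed Insights API, treating Core Web Vitals as user-experience indicators.
# The URL is a placeholder; metric key names may change, so inspect the JSON
# if a key is missing.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
url_to_test = "https://example-news.com/politics/some-article"

response = requests.get(
    PSI_ENDPOINT, params={"url": url_to_test, "strategy": "mobile"}, timeout=60
)
field_data = response.json().get("loadingExperience", {}).get("metrics", {})

for metric in ("LARGEST_CONTENTFUL_PAINT_MS", "CUMULATIVE_LAYOUT_SHIFT_SCORE",
               "INTERACTION_TO_NEXT_PAINT"):
    values = field_data.get(metric)
    if values:
        print(f"{metric}: p75 = {values.get('percentile')}, category = {values.get('category')}")
```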

 

 

For SEO professionals tasked with conducting in-house audits, particularly in the complex world of news websites, what foundational advice would you offer to guide their process effectively?

Barry Adams: Emphasizing a structured approach, Adams advises starting with identifying and analyzing the website's templates, such as homepages, section pages, topic pages, and various article types. This initial step allows for a focused review of what can be optimized for each template type concerning efficient crawling and rapid indexing by search engines.

Moving beyond the template level, Adams suggests not overwhelming the audit with the entirety of the site's content but instead taking a significant sample for analysis. By crawling a substantial but manageable portion of the site, auditors can gather the necessary data to infer broader insights applicable across the website.

For websites with extensive and diverse content, breaking down the audit into segments, such as focusing on specific sections or themes at a time, can yield more detailed and actionable findings. This approach is particularly beneficial for sites utilizing multiple technology stacks, presenting unique challenges in consistency and optimization.
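A minimal sketch of that sampling approach: pull article URLs from an XML sitemap, group them by section, and draw a manageable sample per section for crawling. The sitemap URL and sample size are assumptions, and large publishers will usually need to walk a sitemap index rather than a single file.

```python
import random
from urllib.parse import urlparse
from xml.etree import ElementTree

import requests

# Minimal sketch: build an audit sample by grouping sitemap URLs by section
# (first path segment) and sampling a fixed number per section.
# The sitemap URL and sample size are assumptions.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_xml = requests.get("https://example-news.com/sitemap.xml", timeout=10).text
urls = [loc.text for loc in ElementTree.fromstring(sitemap_xml).findall(".//sm:loc", NS)]

by_section = {}
for url in urls:
    path_parts = urlparse(url).path.strip("/").split("/")
    section = path_parts[0] or "(root)"
    by_section.setdefault(section, []).append(url)

for section, section_urls in by_section.items():
    sample = random.sample(section_urls, min(50, len(section_urls)))
    print(f"{section}: {len(section_urls)} URLs, sampled {len(sample)} for crawling")
```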

Adams highlights the value of Google Search Console (GSC) as a tool not just for identifying issues but for guiding SEOs toward potential areas of concern. While GSC may not provide direct solutions, it effectively points to areas requiring deeper investigation, such as issues with redirects, soft 404s, or discrepancies in canonical tags.

 

 

In the realm of technical SEO, with an array of tools at our disposal, which ones do you find indispensable for conducting audits, especially given the depth of analysis required for news websites?

Barry Adams: Emphasizing a balance between tool reliance and manual analysis, Adams highlights his preference for a select few tools that bolster his auditing process:

1. SiteBulb: Praised for its comprehensive technical analysis capabilities and now enhanced by a cloud-based version, SiteBulb stands out for its detailed reports, making it Adams' crawler of choice for delving into website structures and identifying issues.

2. Google Search Console (GSC): Serving as a foundational starting point for audits, GSC offers invaluable data directly from the search engine's perspective, guiding SEOs in identifying and prioritizing areas for improvement.

3. WebPageTest: For in-depth load speed analysis, WebPageTest provides detailed insights into a website's performance, a critical aspect of technical SEO that directly impacts user experience and search rankings.

While acknowledging the utility of these tools in facilitating technical audits, Adams stresses the importance of not becoming overly dependent on them. He advocates for direct engagement with the website being audited, suggesting that a nuanced understanding of the site's content, architecture, and user experience is essential for identifying genuine improvement opportunities. This hands-on approach ensures that audits are not only informed by data but also by a comprehensive understanding of the website's unique context and challenges.

 

 

When we talk about syndication, especially regarding your article about Yahoo impacting publishers, what were your main concerns?

Barry Adams: My concerns revolved around Google's handling of syndicated content. Despite publishers following Google's recommendations, like using canonical tags, the reality is that Google struggles with deduplication. This often results in syndicated content outperforming the original content in search rankings.
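The canonical mechanism Barry refers to is a cross-domain canonical on the syndicated copy pointing back at the original article. A minimal sketch of how a publisher might spot-check that a partner actually implements it; both URLs are placeholders.

```python
import requests
from bs4 import BeautifulSoup

# Minimal sketch: check whether a syndicated copy declares a cross-domain canonical
# pointing back at the original article. Both URLs are placeholders.
original = "https://example-news.com/politics/original-article"
syndicated_copy = "https://partner-portal.example.com/news/original-article-republished"

soup = BeautifulSoup(requests.get(syndicated_copy, timeout=10).text, "html.parser")
canonical = soup.select_one('link[rel="canonical"]')

if canonical and canonical.get("href") == original:
    print("Canonical points back at the original.")
else:
    print(f"Missing or wrong canonical: {canonical.get('href') if canonical else None}")
```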

John Shehata: Given these challenges, what strategies do you recommend for publishers to mitigate the impact of syndication on their search visibility?

Barry Adams: A few strategies can help. First, delaying the syndication of content so that the original article has time to gain traction. Second, providing slightly altered or less optimized versions of the articles for syndication. And third, negotiating terms with syndication platforms to ensure the original publisher's visibility is prioritized. It's about finding a balance that doesn't compromise the publisher's search traffic and revenue.

John Shehata: How does the issue of syndication reflect on Google's abilities to handle duplicate content, especially in the news sector?

Barry Adams: It highlights a fundamental weakness. Google's deduplication process is limited, particularly in the fast-paced news environment. The process is not just about comparing HTML content but involves understanding the context and authority signals, which Google does at a basic level, often leading to suboptimal outcomes for original publishers.

John Shehata: Looking at the business side, how do publishers navigate the trade-off between immediate syndication revenue and potential search traffic loss?

Barry Adams: Publishers need to make a commercial decision, weighing the potential search traffic against the immediate revenue from syndication. This might involve accepting a lower revenue from delayed syndication if it means retaining a higher volume of direct search traffic, which could be more valuable in the long term.

John Shehata: To wrap up our discussion on syndication, any final thoughts on how publishers can protect their interests?

Barry Adams: Publishers must be proactive in their syndication strategies, ensuring they do not inadvertently give away their competitive edge in search rankings. It involves ongoing negotiations with syndication partners, careful monitoring of search performance, and adapting strategies to safeguard their content's visibility and value.

 

 

For large sites with a lot of old, irrelevant news articles, should these be tidied up and redirected, or is it better to leave them for topical authority and external links?

Barry Adams: Unless content is actively harmful, it should generally not be deleted. Old content can contribute to a site's historical record and topical authority. Pruning should only be considered if the content is not representative of the publisher's current standards, lacks inbound links and traffic, or if hosting costs are prohibitive.

John Shehata: With discussions around pruning content to improve site performance, what's your take on this practice?

Barry Adams: The idea that removing content directly leads to increased traffic is a misconception. The actual benefits of pruning are more nuanced and often relate to specific goals like reducing hosting costs or improving crawl efficiency, rather than a general improvement in site performance.

John Shehata: What are valid reasons for considering content deletion on news websites?

Barry Adams: Valid reasons include removing content that no longer aligns with the publisher's standards, is of poor quality, lacks traffic and links, or to save on hosting costs for an extensive archive. However, these decisions should be made cautiously, considering the potential impact on the site's historical integrity and topical authority.

John Shehata: How does old content impact crawl efficiency and Google's indexing priorities?

Barry Adams: Concerns about crawl efficiency due to old content are often unfounded. Google is adept at prioritizing important pages for crawling, such as the homepage, main section pages, and new articles. Unless there's clear evidence of indexing delays or crawl issues, pruning old content is unlikely to yield significant benefits.

John Shehata: So, the push for content pruning is often based on misconceptions?

Barry Adams: Yes, the drive to delete old content frequently stems from misunderstandings about how search engines prioritize and crawl content. Real issues requiring attention are often different from the perceived problem of having too much old content.

 

 

What are effective AI techniques for enhancing technical SEO performance?

Barry Adams: AI can excel in pattern detection, assisting with large-scale website analyses. An illustrative use case involves merging sites, where AI helped combine content from overlapping pages of merged companies, preserving ranking value and semantic content integrity. This approach streamlined the merging process and automated redirect mapping, significantly reducing manual effort.
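A minimal sketch of that pattern-matching idea, using TF-IDF text similarity to propose a redirect map between a retiring site and its surviving counterpart. This illustrates the general technique rather than the specific tooling Barry used: the page texts are stand-ins for crawled content, and every suggested pair still needs human review.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Minimal sketch: map each page on the retiring site to its most similar counterpart
# on the surviving site by comparing page text, as a starting point for a redirect map.
# Page texts below are placeholders for crawled content.
old_pages = {
    "/about-acme": "Acme Ltd company history, leadership team and mission statement",
    "/acme-widgets": "Product overview of Acme widgets, specifications and pricing",
}
new_pages = {
    "/company/about": "Company history, leadership and mission of the merged group",
    "/products/widgets": "Widget product range with full specifications and prices",
}

corpus = list(old_pages.values()) + list(new_pages.values())
vectors = TfidfVectorizer(stop_words="english").fit_transform(corpus)
old_vecs, new_vecs = vectors[: len(old_pages)], vectors[len(old_pages):]
scores = cosine_similarity(old_vecs, new_vecs)

for i, old_url in enumerate(old_pages):
    best = scores[i].argmax()
    print(f"301 {old_url} -> {list(new_pages)[best]}  (similarity {scores[i][best]:.2f})")
```

In practice, pairs below a chosen similarity threshold would be routed to a human for manual mapping rather than redirected automatically.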

John Shehata: How widespread is the use of AI in technical SEO tools, and what improvements do you anticipate?

Barry Adams: While AI's potential in technical SEO is recognized, its practical application remains limited in many tools. AI could significantly aid in identifying patterns and automating complex processes like site mergers. However, Barry emphasizes a cautious approach, noting that AI might not solve all technical SEO issues and advocates for a manual-first strategy to ensure accurate problem-solving.

John Shehata: Can AI replace manual efforts in technical SEO?

Barry Adams: AI can support but not fully replace manual efforts in technical SEO. It offers efficiency in specific tasks, like analyzing large datasets for pattern recognition. Yet, a cautious, manual-first approach is advised to ensure that the root causes of issues are addressed, not just the symptoms identified by AI.

 

 

How do you view the practice of frequently refreshing news articles to maintain prominence in search results, especially given its prevalence in Europe?

Barry Adams: Barry acknowledges the strategy as an exploit of Google's system, where new URLs are interpreted as fresh articles, thus potentially gaining unfair advantage in visibility. While he sees occasional use as acceptable for articles of significant importance, excessive use has already led to negative impacts from algorithm updates. Barry advises caution, suggesting moderation and adherence to best practices over exploiting vulnerabilities, emphasizing that long-term penalties could outweigh short-term gains.

John Shehata: Should publishers in Europe and elsewhere continue the practice of article URL refreshing to stay competitive in top stories?

Barry Adams: Barry recommends restraint, warning against aggressive exploitation of Google's weakness in handling refreshed content. He mentions that Google's disapproval could lead to future punitive actions against publishers who overuse this tactic. His advice is to prioritize integrity and sustainable SEO practices over short-term exploitation, hinting at inevitable repercussions for those who excessively manipulate their content's visibility.

 

 

What's your perspective on the impact of Search Generative Experience (SGE) on news consumption and its broader implications?

Barry Adams: Barry expresses uncertainty about the long-term impact of SGE on search and news consumption. He suggests SGE may not drastically alter news reading habits, as people are likely to continue relying on their preferred news sources. While acknowledging SGE's potential for summarizing news, he notes the current tension between AI developers and publishers regarding content access for AI training. Barry sees a potential for mutually beneficial collaborations between large publishers and AI developers but warns that publishers reliant on evergreen content may face challenges. He concludes that the hype around SGE may be overblown, anticipating it to settle into specific, valuable use cases without completely transforming the digital landscape.

John Shehata: How has your view on SGE evolved, and how do you see it fitting into the future of online content and search?

Barry Adams: Barry admits his initial concerns about SGE potentially revolutionizing the web have moderated. He now views SGE as an evolutionary step that will find its niche without upending established online behaviors or significantly detracting from traditional news consumption. He observes a cautious approach from Google in implementing SGE, suggesting even Google is still exploring SGE's potential and limits. Barry aligns with the view that while SGE may influence certain aspects of online interaction, it is unlikely to be the all-encompassing disruptor some anticipated.

 

 

How should niche news sites structure their categories? Is it better to have broad categories or more specific ones, and how should subcategories be utilized?

Barry Adams: Barry advocates for niche news sites to adopt fairly specific main categories that directly relate to their industry focus, rather than broad, generic ones. He believes that specific categories and the use of industry-relevant terms in top navigation send strong topical authority signals to Google. Barry also touches on the use of tag pages as essentially category pages for topics that don't fit in the main navigation, suggesting a pragmatic approach to their use based on the volume of related content produced. His advice underlines the importance of aligning category and tag strategies with the publisher's content volume and audience expectations, emphasizing specificity to enhance both user experience and search engine visibility.

 

 

Can you discuss SEO metrics that gauge the effectiveness of an editorial team?

Barry Adams: Barry highlights the importance of not solely relying on SEO performance to measure the effectiveness of an editorial team. He advises against judging a journalist's work purely on the number of clicks from Google, as this could lead to undesirable content practices. Instead, he suggests a holistic approach, considering various metrics across different channels. For SEO specifically, the primary metric is the number of clicks from Google, with the expectation that there should be a baseline understanding of what an average article should achieve in terms of clicks, adjusted for the topic's niche or general interest. Barry also distinguishes between clicks from Discover and search, suggesting separate analysis due to their different optimization strategies. He emphasizes the importance of empowering and reminding editors and journalists of best practices in SEO, such as effective use of internal links and keyword placement, to build long-term traffic.
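A minimal sketch of that separate analysis, pulling clicks for web search and Discover independently from the Search Console API. The property URL, credentials file, and date range are placeholders, and this assumes a service account that has been granted access to the property.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Minimal sketch: report clicks separately for web search and Discover via the
# Search Console API. Property URL, credentials file, and dates are placeholders;
# the "type" field name follows the current API docs (older clients call it "searchType").
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)
site = "https://example-news.com/"

for traffic_type in ("web", "discover"):
    body = {
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["page"],
        "type": traffic_type,
        "rowLimit": 25,
    }
    rows = service.searchanalytics().query(siteUrl=site, body=body).execute().get("rows", [])
    total_clicks = sum(row["clicks"] for row in rows)
    print(f"{traffic_type}: {total_clicks} clicks across top {len(rows)} pages")
```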

 

 

How can a publisher appearing in the Google News tab improve their chances of appearing in Top Stories?

Barry Adams: Barry explains that not appearing in Top Stories, despite being approved in Google Publisher Center and appearing in the Google News tab, likely isn't a technical issue. Key reasons might include the publication's newness or lack of sufficient authority signals. The News tab has a broader scope than Top Stories, allowing for a wider range of content. Barry suggests that to appear in Top Stories, publications need to focus on building authority through high-quality journalism. He also notes that if there's a delay in Google indexing new articles, this could affect visibility in Top Stories. Technical aspects, like the use of client-side JavaScript, might hinder prompt indexing and thus presence in Top Stories.

John Shehata adds that being listed in the News Publisher Center doesn't guarantee visibility in Top Stories. He recommends using Google Search Console to verify if a site receives traffic from Google News as a more accurate measure of inclusion in Google's news ecosystem. He also mentions that the News tab itself draws significantly less user engagement compared to Top Stories and Google News, highlighting the importance of focusing efforts on areas that drive the most traffic.

 

 

What is the future of AMP for news?

Barry Adams: I'll give you a one-word answer: none. But I'll give a bit more background. We all know AMP, and we all hate it. Look, AMP is open source. They have a GitHub repository, and all you really need to do to understand the future of AMP is look at the amount of contributions being made to that repository. It fell off a fucking cliff and it's just dead in the water. It is not being updated anymore. It is a dead standard and does not have a future. So get rid of it. Put effort into improving Core Web Vitals instead. And just forget about it. It's history, man.

 

 

Where does user-generated content (UGC) fit into current news SEO best practices?

Barry Adams: Barry sees value in user-generated comments and content, such as product reviews and experiences, but emphasizes the need for careful moderation. He suggests making UGC non-indexable for Google if tight moderation controls cannot be ensured to maintain content quality. Only when editorial oversight can guarantee quality should UGC be exposed to Google's indexing. Profanity and misinformation in UGC can be negative signals in Google’s ecosystem, making moderation crucial. He notes AI tools might aid in making comment moderation more manageable, yet advises caution in allowing UGC to be indexed by Google.
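One common way to keep unmoderated UGC out of the index is to serve comments from a separate endpoint that the article page loads client-side and that answers with an X-Robots-Tag: noindex header. A minimal sketch under those assumptions, using Flask purely for illustration; the route and data source are hypothetical.

```python
from flask import Flask, jsonify, make_response

# Minimal sketch: serve moderated comments from a dedicated endpoint marked noindex,
# so the article page can load them client-side without exposing UGC to indexing.
# The route and data source are hypothetical.
app = Flask(__name__)

def load_moderated_comments(article_id):
    # Placeholder: in practice this reads approved comments from your datastore.
    return [{"author": "reader123", "text": "Great piece."}]

@app.route("/api/comments/<article_id>")
def comments(article_id):
    response = make_response(jsonify(load_moderated_comments(article_id)))
    response.headers["X-Robots-Tag"] = "noindex"  # keep the comment fragment out of the index
    return response
```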

 

 

Would you recommend building an HTML sitemap in addition to your XML sitemap?

Barry Adams: Barry appreciates well-structured HTML sitemaps for their role in improving crawl efficiency and highlighting the importance of topic pages. He cites examples like the New York Times and notes that some publishers cleverly organize sitemaps by trending topics or alphabetically list all topic pages. HTML sitemaps offer Google clear pathways to important content, varying by how many clicks away they are from the homepage. Barry endorses the use of HTML sitemaps alongside XML sitemaps.
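A minimal sketch of the alphabetical flavour Barry mentions: generating a simple A-Z HTML sitemap of topic pages so every topic sits a predictable number of clicks from the homepage. The topic list is a placeholder; a real build would read topics and URLs from the CMS.

```python
from collections import defaultdict

# Minimal sketch: generate an A-Z HTML sitemap of topic pages. Topic names and
# paths are placeholders for data pulled from the CMS.
topics = {
    "Artificial Intelligence": "/topics/artificial-intelligence",
    "Brexit": "/topics/brexit",
    "Climate Change": "/topics/climate-change",
}

groups = defaultdict(list)
for name, path in sorted(topics.items()):
    groups[name[0].upper()].append((name, path))

html = ["<h1>All topics</h1>"]
for letter, entries in sorted(groups.items()):
    html.append(f"<h2>{letter}</h2>\n<ul>")
    html.extend(f'<li><a href="{path}">{name}</a></li>' for name, path in entries)
    html.append("</ul>")

print("\n".join(html))
```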

 

 

Is there a tendency in Google Discover to give traffic to smaller niche publishers, and does it work the same way in top stories? Are these ranking algorithms the same?

Barry Adams: The core ranking algorithms used by Google for Discover and Top Stories are fundamentally the same, focusing on quality and relevancy. However, Discover emphasizes personalization and engagement signals more than Top Stories. In Top Stories, Google aims for neutrality with limited personalization, while Discover is highly personalized, showing users content from websites they often engage with. This difference means that Discover might favor smaller niche publishers more if they have positive engagement metrics. Essentially, while the algorithms share a base, the "volume buttons" for certain metrics are adjusted differently between Discover and Top Stories.

 

 

Would you recommend using a subdomain or path for different sections? For example, xx.com/sport/news or sports.xx.com, news.xx.com.

Barry Adams: Always use paths (folders), not subdomains. The decision is almost always clear-cut, favoring paths due to how Google interprets hostnames. When a website is approved for Google News and especially for Top Stories, that approval is tied to the hostname. A subdomain is seen as a potentially separate website, meaning any Google News inclusion needs to be re-earned. Additionally, internal links within the same hostname pass more link value than links to a different hostname, suggesting that paths are more effective for SEO than subdomains. In nearly all cases, use subfolders instead of subdomains to maintain link value flow and to avoid losing Google News inclusion. Exceptions exist, but they require careful mitigation strategies.

 

 

John Shehata: What advice would you give to news SEOs and news organizations for 2024 and beyond?

Barry Adams: The best advice for both news SEOs and news organizations is to focus on building a loyal audience that directly visits your website, not one that depends on third-party technology companies like Google, Apple News, Facebook, or Amazon. Develop a brand strong enough that your audience seeks you out directly, bypassing the need to rely on external platforms for traffic. Engaging directly with your audience, encouraging newsletter sign-ups, and fostering a community can make you less susceptible to algorithm changes and more sustainable in the long term. Building a loyal audience and a recognizable brand is key to creating a resilient and successful news publication.

 

John Shehata: Thank you again so much. That was very informative, and we hope to have the webinar available on YouTube soon.

Barry Adams: Thank you to all the attendees and for the excellent questions. It was truly enjoyable. Goodbye.

 

 

