Saturday, February 29, 2020

Leadership Chosen for JUUL Cases Consolidated in California State Court

On February 19, Los Angeles Superior Court Judge Ann Jones, who has been selected to preside over the consolidated California state court lawsuits against JUUL Labs, appointed a number of attorneys to lead the litigation against the San Francisco-based manufacturer. The team of lawyers will oversee the nearly 100 lawsuits that have been filed in state court. The cases allege that the e-cigarette manufacturer created a vaping epidemic among the state’s young people through deceptive marketing practices that misled consumers into thinking its products are less addictive than traditional cigarettes. The lawsuits also claim that the epidemic constitutes a public health crisis that endangers students’ academic performance as well as their health.

John Fiske of Baron & Budd and Rahul Ravipudi of Panish Shea & Boyle LLP have been appointed to serve as counsel to public entities such as schools, including the Los Angeles and San Diego school districts. Other attorneys who will assume leadership roles include Rick Meadow of The Lanier Law Firm, Adam Pulaski of Pulaski Kherkher and Robert Binstock of Reich & Binstock as members of the Plaintiffs’ Steering Committee, along with Brooks Cutter of Cutter Law, Francois Blandeau of Heninger Garrison Davis, Frederick Darley of Beasley Allen Crow Methvin Portis & Miles PC, Lewis Garrison of Heninger Garrison Davis, Hirlye Lutz of Cory Watson and Jacob Plattenberger of TorHoerman Law. Thomas Girardi of Girardi Keese and Ray Boucher of Boucher LLP will serve as private liaison counsel, and Paul Kiesel of Kiesel Law and Mark Robinson will represent private plaintiffs, those individuals with health claims bringing suits against JUUL. William Levin of Levin Simes Abrams LLP and Dan Robinson of Robinson Calcagnie Inc. will be liaisons to the separate federal MDL also pending in California.

Judge Jones indicated in a public hearing on January 28 that she was seeking “common initial discovery” to prepare for upcoming bellwether trials.

https://www.forlawfirmsonly.com/leadership-chosen-for-juul-cases-consolidated-in-california-state-court/

Friday, February 28, 2020

Litigation Update: The Latest Developments in J&J and Revlon Talcum Lawsuits

A month after settling a case mid-trial in Alameda County, California, J&J put an end to a New York trial on February 25, in which plaintiff Laura Shanahan accused J&J’s baby powder of causing her pleural mesothelioma. The jury was relieved of its duties before it got a chance to hear the case. The amount of the settlement has not been disclosed. According to the testimony, Ms. Shanahan was a lifetime user of J&J’s powder, which she started using in early childhood and continued using into adulthood.

Also in New York on February 25, Revlon, a manufacturer of cosmetic products, was hit with a lawsuit by Maryland resident Laura McDaniel. The plaintiff claims that she was exposed to Revlon’s Jean Nate talcum powder product, purchased for her by her father while he was employed by Revlon in New York. The plaintiff was diagnosed with mesothelioma in January. Ms. McDaniel is seeking $20 million in compensatory damages and $40 million in punitive damages. Other chemical distributors were included in the lawsuit in addition to Revlon for failure to warn of the dangers of asbestos, which companies have known about since as early as the 1930s. Revlon introduced the Jean Nate product in 1935.

https://www.forlawfirmsonly.com/litigation-update-the-latest-developments-in-jj-and-revlon-talcum-lawsuits/

Google’s Response to Moz Article Critical of SERPs via @martinibuster

Google’s Danny Sullivan tweeted a response regarding an article written by Dr. Pete Meyers. The article, published on Moz, was about an increase in search features that push down the traditional ten blue links. Danny raised interesting issues with the article that deserve to be considered.

What is an Organic Listing?

The first point Danny Sullivan discussed was the definition of an organic listing. The Moz article defines an organic listing as the traditional ten blue links that link to a web page. Everything else is described as “organic components” or “technically organic” as a way to set them apart from the ten blue links, which the article regards as the organic listings. Danny tweeted:
“Your customers probably won’t understand that organic isn’t just web pages if you continue to use organic to mean that. Saying organic listings are “technically” that way or have a “component” — sorry, but it feels like it feeds misunderstandings and confusion.”
He continued:
“My concern is people who don’t take care to read come away with the idea that organic has diminished when there is organic all over the page. It potentially keeps people thinking backward rather than forward.”

Move Forward Not Backwards

I believe that by “people thinking backward” Danny means clinging to the idea that SERPs are ten blue links and ignoring the opportunities latent in rich search features. “Thinking forward” may be understanding that featured snippets, videos and so on represent opportunities to rank in a different way and get more traffic. I know for myself that when I search for the name of a song I often look for the green Spotify icon so that I can click it and listen to the song while in the car. That green Spotify icon isn’t part of the ten blue links, but it is immensely useful.

Vague Search Queries

Moz’s example of the “worst-case” is a search for the phrase “lollipop.” The report notes that a user must scroll 2,938 pixels to reach the traditional “blue links” organic listings. But according to Danny, you don’t have to scroll nearly 3,000 pixels to find organic listings; there are multiple organic listings at the top of the page. This is what Danny Sullivan tweeted:
“…when I read something like “While featured snippets are technically considered organic” or the idea that for “Lollipop” that the first listing isn’t the big video listings at the very top of the page, there seem to be some problematic assumptions…”
Followed by this tweet:
“Featured snippets aren’t “technically” organic listings. They are organic listings. And ignoring things listings that appear in Top Stories, businesses in local, programs in college displays feels like a dated assessment of how search works….”
Here’s a screenshot of the search results for the search phrase “lollipop.” As you can see in the screenshot, Google’s search result satisfies five search intents:
  1. An organic video listing of the song.
  2. Lyrics for the song.
  3. Links to music services that offer the song.
  4. A link to search results about the song.
  5. A link to search results about lollipop the candy.

Search and Search Intents

Satisfying the search intent for a one-word search phrase is difficult because there are likely to be multiple search intents. Google has to identify the most popular intent. In this case it appears to be the song, Lollipop. Then Google must satisfy the related and alternate search intents (lyrics, listening on a music service, band information and lollipop the candy). If you look at the screenshot, it’s evident that Google successfully satisfies five search intents for that one-word phrase. Search isn’t about linking to websites; that’s the means to an end. The end is satisfying search intents. Sometimes that means a link to Spotify. Sometimes users are satisfied by a link to a video.

An Alternate Look

The following are my thoughts about the article. They’re not meant to be criticisms, just thoughts that occurred to me as I read the article.

1. Keywords in the Article are Vague

Basing a study on keywords with vague search intent all but guarantees that the search results will show features like People Also Ask, local business listings, videos, links to music services and so on. As was pointed out, ten blue links are not as useful for satisfying the multiple search intents behind vague queries.

2. Keyword Examples in Article are Not Head Terms

This is Moz’s stated methodology:
“While the keywords in this data set are distributed across a wide range of topics and industries, the set skews toward more competitive “head” terms. “
Judging by the keyword phrases used as examples in the article, the keywords used in the study are short phrases but not necessarily head terms. Head terms are phrases with a large search volume; what constitutes a head term is defined entirely by how often a query is searched, not by how many words it contains. Moz appears to apply the label “head term” to search phrases that are short but not necessarily popular. This is a common mistake: it is assumed that short phrases of one or two words must have a high search volume. Because people are using more conversational search queries, it could very well be that the vague queries in Moz’s study are not head terms but simply vague terms, which will naturally skew the results toward SERPs with features designed to solve for multiple search intents.

Google Trends Evidence

I checked to see if the Moz article’s search queries were indeed head terms by comparing two of Moz’s search phrases, “lollipop” and “vacuum cleaners,” in Google Trends against a known popular phrase, “iPhone case.”

As the Google Trends graph showed, the two search queries from the Moz article have relatively low search volume compared to the popular phrase “iPhone case.” Moz’s phrases are short, vague and contain multiple search intents. They are arguably not head terms, because by definition a head term has a high search volume. By contrast, the search query “iPhone case” is a true head term, and for it Google shows ads followed by the ten blue links, presumably because the search phrase is unambiguous.

Some may point to Google’s search features like local boxes, videos and carousels as if those features are a bad thing because they push down the ten blue links. But the reason Google shows features is to satisfy search intents, to meet the needs of the user. My suggestion is that perhaps these search features that supposedly make the search results “worse” serve a purpose and can also result in search traffic.
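If you want to reproduce this kind of comparison yourself, here is a minimal sketch using pytrends, an unofficial third-party Python wrapper for Google Trends. Note that Trends reports relative interest, not absolute search volume, and the keyword list below simply mirrors the examples above:

```python
# Minimal sketch: compare relative search interest for Moz's example phrases
# against a known head term. pytrends is an unofficial wrapper (pip install pytrends).
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(
    kw_list=["lollipop", "vacuum cleaners", "iphone case"],
    timeframe="today 12-m",
    geo="US",
)
df = pytrends.interest_over_time()

# Average relative interest over the year; higher = closer to a true head term.
means = df.drop(columns=["isPartial"]).mean().sort_values(ascending=False)
print(means)
```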

3. People Use Conversational Search

Google users are increasingly making conversational, highly personalized search queries, and conversational queries contain more words. Because the methodology skews toward short phrases, the Moz study can arguably be said not to be representative of the state of Google search results. Going by the examples provided in the Moz article, the research uses short and vague queries, which results in a skewed outcome dominated by SERPs with multiple search features designed to help users with a diverse set of search intents. It could be argued that an even-handed study would include conversational search.

Are Blue Links More Useful?

It is arguably unreasonable to assert that a page of ten blue links is the best way to present a complex search result for a vague query with multiple search intents. The Moz article presumes that the ten blue links are the listings that matter and that search features get in the way. This is implied from the very first sentence:
“Being #1 on Google isn’t what it used to be.”
Moz’s definition of #1 is in the context of the ten blue links. The Moz article goes on to say:
“The worst case scenario, a search for “Disney stock,” pushed #1 all the way down to 976px.”
The assumption is that the blue links are what matter and that everything that gets in the way of those blue links makes the search results “worse.” The Moz article states:
“It feels like the plight of #1 is only getting worse.”
Danny Sullivan challenged that view with this tweet:
“Search is about serving info; sometimes a web page isn’t the best source. Providing refinement options helps users narrow to better info, which helps sites….”
The purpose of the different features is to provide answers for queries that have multiple search intents in a way that is easy to navigate. That’s useful. The article itself acknowledges the usefulness of search features at the very end:
“…many rich features are really the evolution of vertical results, like news, videos, and images, that still have an organic component. In other words, these are results that we can potentially create content for and rank in, even if they’re not the ten blue links we traditionally think of as organic search.”
The article notes that there are organic results in the various search features. It also acknowledges there are opportunities in those search features. So it’s kind of puzzling that the article spends so much time making the case that search features make the search results “worse” by pushing down the ten blue links. Read the article and decide for yourself: How Low Can #1 Go? (2020 Edition)

https://www.businesscreatorplus.com/googles-response-to-moz-article-critical-of-serps-via-martinibuster/

Thursday, February 27, 2020

Litigation Update: $1.6B Opioid Global Settlement Reached with Drug Manufacturer

A proposed settlement has been reached with generic drug manufacturer Mallinckrodt Pharmaceuticals, by which the company will pay $1.6 billion to settle the thousands of lawsuits it faces in the Cleveland MDL, as well as claims brought by 47 state and U.S. territory attorneys general over the company’s role in the rampant opioid crisis. The settlement will put Mallinckrodt’s generic units (Mallinckrodt LLC, SpecGx LLC and other affiliates) into Chapter 11 while leaving the specialty drug business out of the restructuring.

The proposed settlement will be paid out to a trust over the course of eight years: $300 million to be paid after Mallinckrodt’s generic units emerge from Chapter 11, $200 million to be paid to plaintiffs for the next two years, and $150 million to be paid each year after that. The trust would also receive options to purchase up to 20% of the shares of the company’s generic units after they exit bankruptcy. The funds in the trust will also be used to provide addiction treatment programs and related initiatives around the country. Mallinckrodt was the largest generic opioid manufacturer in the U.S. during the opioid epidemic.

https://www.forlawfirmsonly.com/litigation-update-1-6b-opioid-global-settlement-reached-with-drug-manufacturer/

How to Identify a Possible Negative SEO Campaign via @AdamHeitzman

The first step in fighting a negative SEO campaign is knowing that you’ve been hit with one. Once you’ve identified that you’re the victim of a negative campaign, you can take steps to remedy the situation; it doesn’t have to be a death sentence for your business. That said, identifying a negative campaign may be harder than it seems. This article will help you diagnose whether your website has been affected by negative SEO so that you can spend more time on the bigger picture: fixing the problem.

What Is Negative SEO?

Negative SEO is defined as any action taken with the intent to negatively impact the search engine rankings of a specific URL. Google claims that they have safeguards in place to identify and stop negative SEO campaigns, and thus they don’t take responsibility for any negative tactics that might get around their safety net. Instead, they encourage negative SEO victims to contact the webmaster of the site in question to resolve the issue. Here is Google’s official statement on negative SEO. Unfortunately, if someone really is trying to harm your site, chances are they’re not going to own up to it, much less change their behavior, just because you contact them. It’s worth a shot, but there are more effective steps to take.

How to Identify a Negative Campaign

1. Has There Been a Sudden Drop in Your Search Traffic?

If the answer is yes and you haven’t made any significant changes yourself, this could be a red flag. Make sure you check your site’s analytics regularly so you can identify a drop in search traffic before too much harm is done.
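As a starting point, here is a minimal sketch of automating that check against a daily export of organic sessions. The file name, column names, and the 40% threshold are all illustrative assumptions, not a standard:

```python
# Flag days whose organic sessions fall far below the trailing 4-week average.
import pandas as pd

df = pd.read_csv("organic_sessions.csv", parse_dates=["date"]).sort_values("date")

# Baseline = mean of the previous 28 days (shifted so today isn't in its own baseline).
df["baseline"] = df["sessions"].rolling(window=28, min_periods=28).mean().shift(1)
df["drop_pct"] = (df["baseline"] - df["sessions"]) / df["baseline"] * 100

alerts = df[df["drop_pct"] > 40]  # 40% below baseline is an arbitrary alarm level
print(alerts[["date", "sessions", "baseline", "drop_pct"]].to_string(index=False))
```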

2. Has Google Sent You a Message Saying That You’re Taking Actions Against Their Guidelines?

This is a pretty straightforward way to see if someone is using negative SEO tactics against you. If you get an error message from Google and you’re not the one making the errors, then probably someone has it in for you. The message might specify a penalty against links, but it could be for any number of different manual penalties. The message box in Google Search Console is one of the first places you’ll want to check if you think there’s something wrong with your site. If it’s been hacked, Google wants you to know.

3. Has There Been a Drop in Your Individual Keyword Rankings?

Again, keeping track of your keyword rankings regularly is just good SEO practice, but it especially helps in case of a negative attack. If one day you notice that one or more of your keywords have dropped significantly in rankings, there’s probably something wrong. Hopefully you’re using a reliable rank tracking tool to keep track of your positioning so it will be easy to identify a major change.

4. Has There Been a Spike in Your Backlinks?

You should be monitoring your links the same way you monitor your keywords; if you notice a spike in your backlinks, or a significant change in their quality, this could be a sign of a negative SEO campaign. Use a tool that monitors your backlinks and emails you when it notices a change, so you can identify any problems quickly and easily. If you receive a notification saying that you’ve all of a sudden received hundreds of new links, you definitely need to take action before those links damage your site. A minimal sketch of this kind of snapshot comparison follows; it also surfaces removed links, which matters for point 6 below.
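This is a rough sketch, assuming weekly CSV exports from your backlink tool with a source_url column; the file names, column name, and 10% threshold are assumptions:

```python
# Diff two backlink snapshots to spot sudden gains (and losses) of links.
import csv

def load_links(path):
    with open(path, newline="") as f:
        return {row["source_url"] for row in csv.DictReader(f)}

old = load_links("backlinks_last_week.csv")
new = load_links("backlinks_this_week.csv")

gained, lost = new - old, old - new
print(f"{len(gained)} new links, {len(lost)} removed links")

# An arbitrary alarm: more than 10% new links in a week deserves a manual review.
if old and len(gained) / len(old) > 0.10:
    print("Spike detected - review the new linking pages manually:")
    for url in sorted(gained):
        print(" +", url)
```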

5. Be Aware of Blocked Links Not Showing

There will be cases where tools can’t help. Some websites block crawlers like Moz and Ahrefs, so if you’re only monitoring those resources you could still be missing hazardous links. This is where Google Search Console and even Google Analytics can be helpful. In Google Search Console, you will find a backlink report listing the links to your website that Google has discovered. You will have to manually check links that look suspicious, as you won’t get as much information as from dedicated tools. With Google Analytics, you can check the referral traffic report to see which sites are driving users to your website, flag anything that seems out of the ordinary, and manually check whether it could be part of a negative SEO campaign against your website.
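Here is a rough sketch of triaging such a referral export, assuming a CSV with source, sessions, and bounce_rate columns; the thresholds and the TLD list are invented for illustration:

```python
# Flag referrers that send lots of traffic that immediately bounces,
# plus referrers on throwaway TLDs - both worth a manual look.
import pandas as pd

df = pd.read_csv("referrals.csv")

suspicious = df[(df["sessions"] > 100) & (df["bounce_rate"] > 0.95)]
spammy_tlds = (".xyz", ".top", ".loan")  # illustrative only, not a definitive list
flagged = df[df["source"].str.endswith(spammy_tlds, na=False)]

review = pd.concat([suspicious, flagged]).drop_duplicates()
print(review.sort_values("sessions", ascending=False).to_string(index=False))
```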

6. Has There Been a Decrease in Your Backlinks?

Also, pay attention to any drops in your backlink profile. Fluctuation is normal when it comes to links, but if you notice that valuable links you’ve earned are dropping off, it’s very important to investigate and find out why. Negative SEO includes asking for your links to be removed. If a link has recently been removed, quickly reach out to that website and politely ask why. If it’s because they were asked to remove it, you can now address the situation. You can also be proactive by letting other websites you have a relationship with know what is going on, so they can ignore such requests.

7. Is Your Content Showing up Everywhere?

Content scraping is taking your original content, word for word, and publishing it on other websites. It’s an easy method for anyone wanting to attack you with negative SEO. There are many benefits to syndicating content on high-authority, relevant sites, if done correctly. Google provides guidelines on how to make sure you aren’t hurting your own website: they recommend having a link back to the original article, and you can also ask the website hosting your content to add noindex at the page level. Why is this important? Google warns that it will choose which version to rank and show to users in search results, and that may or may not be your original content. So syndicate carefully. That’s also why content scraping can be very effective as a negative SEO tactic: plastering your content all over the web creates duplicate content and the opportunity for other websites to outrank you. If you notice a drop for a specific page on your website, a tool like Copyscape will help identify any duplicates of your content living on other websites.
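If you would rather check a suspected copy directly, here is a rough sketch that compares two pages by word-shingle overlap. Both URLs are hypothetical and the HTML stripping is deliberately crude; a high overlap score suggests scraped content:

```python
# Compare two pages by 8-word shingle overlap (Jaccard similarity).
import re
import urllib.request

def shingles(url, k=8):
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    text = re.sub(r"<[^>]+>", " ", html)              # crude tag stripping
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

a = shingles("https://example.com/original-article")  # your page (hypothetical)
b = shingles("https://example.net/suspected-copy")    # the suspect (hypothetical)

jaccard = len(a & b) / max(len(a | b), 1)
print(f"Shingle overlap: {jaccard:.0%}")
```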

8. Perform a Basic Audit of Your Site

If you haven’t noticed any major changes in any of the areas listed above (or even if you have) but you just have a feeling that something isn’t right, perform a basic audit of your site to see how it appears in Google. You can do this with a site: search in Google (e.g., site:example.com); for Target, that would be site:target.com. Google should return a list of pages from your domain. Use your best judgment to evaluate those pages and look for anything that seems off.
  • Do you recognize all the pages listed?
  • Are there any pages missing?
  • Have any of your pages been demoted in their rankings?
These are all helpful questions to ask yourself. If the answer to any of them is “yes,” then you could be the victim of a negative campaign. A minimal scripted version of this spot check is sketched below.
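This sketch uses the Google Custom Search JSON API. It assumes you have created your own API key and Programmable Search Engine ID, and note that its index can differ from what you see on google.com:

```python
# List pages Google's Custom Search index returns for a site: query,
# so you can eyeball them for pages you don't recognize.
import json
import urllib.parse
import urllib.request

API_KEY, CX = "YOUR_KEY", "YOUR_CX"  # placeholders for your own credentials
params = urllib.parse.urlencode({"key": API_KEY, "cx": CX, "q": "site:example.com"})
url = f"https://www.googleapis.com/customsearch/v1?{params}"

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

for item in data.get("items", []):
    print(item["link"], "-", item["title"])
```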

Conclusion

Knowing you’ve been hit by negative SEO is the first step in remedying the problem. A multitude of tools are available to help you both identify and counter negative SEO campaigns. There’s no need to panic. Be diligent in your website monitoring and security and you should have no problems (save some extra time spent and maybe a few headaches) dealing with negative SEO.

https://www.businesscreatorplus.com/how-to-identify-a-possible-negative-seo-campaign-via-adamheitzman/

Google mobile-first indexing to be applied to all sites within a year

By Barry Schwartz

Google is sending out notices via Google Search Console with “mobile-first indexing issues detected” alerts. In those emails it communicates the issues Google has with moving the site over to mobile-first indexing. It also says “Google expects to apply mobile-first indexing to all websites in the next six to twelve months.”

Mobile-first indexing. Google first introduced mobile-first indexing back in November 2016, and by December 2018 half of all sites in Google’s search results were served from mobile-first indexing. Mobile-first indexing simply means that Google crawls your site through the eyes of a mobile browser and uses that mobile version for indexing and ranking.

All in on mobile-first indexing. Google is sending out notices now, and those notices say “Google expects to apply mobile-first indexing to all websites in the next six to twelve months.” A screenshot of the “mobile-first indexing issues detected” email was shared by @KyleW_Sutton.

The notices. Clearly Google is trying to be proactive and notify sites that have not yet moved over to mobile-first indexing, with specific advice on what those sites need to do to become ready. In this specific case, Kyle said “Missing image” was listed as an error, while “Page quality issue” and “Video issues” were warnings.

Why we care. If you get one of these notices, it is likely a notice you should read and act on. If Google has issues accessing your site with the mobile crawler, it might impact the indexing and ranking of your web pages in Google.
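One rough way to look for the kind of “missing image” gap described above is to compare what your server returns to a mobile versus a desktop user agent. This sketch only catches server-side differences (not JavaScript-rendered content), and the URL is hypothetical:

```python
# Compare <img> tags served to desktop vs. mobile user agents.
import re
import urllib.request

URL = "https://example.com/"  # hypothetical
UAS = {
    "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "mobile": "Mozilla/5.0 (Linux; Android 10; Pixel 3) AppleWebKit/537.36 Mobile",
}

def img_srcs(ua):
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", "ignore")
    return set(re.findall(r'<img[^>]+src="([^"]+)"', html))

missing = img_srcs(UAS["desktop"]) - img_srcs(UAS["mobile"])
print(f"{len(missing)} image(s) absent from the mobile version")
for src in sorted(missing):
    print(" -", src)
```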
 

About The Author

Barry Schwartz is a Contributing Editor to Search Engine Land and a member of the programming team for SMX events. He owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry’s personal blog is named Cartoon Barry and he can be followed on Twitter.

https://www.businesscreatorplus.com/google-mobile-first-indexing-to-be-applied-to-all-sites-within-a-year/

Wednesday, February 26, 2020

Google Explains Gradual Declines in Ranking via @martinibuster

In a Webmaster Hangout Google’s John Mueller answered why a site might be gradually losing traffic. Once you know the reasons you’ll be better positioned to take action to reverse a traffic decline.

Links and Content Not Always to Blame

The publisher asked whether their backlinks (50% from a single subdomain) and auto-generated content were to blame for a gradual, month-to-month drop in rankings. Mueller’s answer:
“I don’t think that any of your changes that you’re seeing are related to this content and those particular links. But it’s probably… a more general thing. In particular when you’re seeing kind of a gradual drop over a longer period of time then that to me points at kind of natural ranking changes where things in the ecosystem have changed, things in the algorithm are slightly changing, maybe users are searching in different ways or expecting different kinds of content in the search results. And that generally wouldn’t be a sign that there’s this one thing that you’re doing wrong which kind of made everything blow up if you’re seeing these kinds of granular step by step changes over a longer period of time.”
According to John Mueller’s answer, there are at least five issues that can cause a site to gradually lose rankings.

Five Reasons Why Sites Gradually Lose Rankings

  1. Ecosystem changes
  2. Algorithm changes
  3. Changes in how users search
  4. Changes in what content users expect
  5. Gradual declines do not stem from large, dramatic website issues
Related: How to Analyze the Cause of a Ranking Crash

Ecosystem Changes

John Mueller did not elaborate on what “ecosystem changes” means, but it’s clearly something outside of the site that is losing traffic. I would speculate that it could be something like link rot, the constant and natural disappearance of links. Links disappear when sites go offline or when web pages are updated or removed. It’s rare for a site to remain static, and the same is true for links. Another example of an ecosystem change is an increase in competition. When a competitor improves their promotional activity, their site will rise in the rankings, and that means someone else’s site is going to lose rankings. Ranking a site is a process that is always in motion. You’re either moving forward or falling behind; there is no standing still.

Algorithm Changes

How sites are ranked has changed enormously, particularly since the August 2018 update. August 2018 is when Google appears to have increased its focus on classifying websites and on improving how it understands search queries and web pages. Every major update from 2018 through BERT seems to be about understanding what users mean when they search and whether web pages answer those questions. These algorithm updates have redefined what it means for a web page to be relevant to a search query.

Take a Second Look

John Mueller wrapped up his answer by encouraging the site owner to take a second look at the site and try to identify things that could be improved. This is his advice:
“So, that to me would be something where I try to take a step back and try to take a look at the website in general overall and find maybe areas where you can make significant improvements to kind of like turn the tide around a little bit and make sure that your site becomes more relevant or becomes significantly more relevant for the kinds of users you’re trying to target.”
Related: 20 Reasons Why Your Search Ranking & Traffic Might Drop

Takeaway: Keep an Open Mind

For almost anything, there will be one explanation that stands out. But just because something is obvious does not mean it’s the actual explanation; it is simply more noticeable. When diagnosing what is going wrong with a site, it’s important to keep an open mind. Don’t assume that the obvious thing is the reason. Keep looking, because in the case of a gradual traffic decline there are at least five possible explanations.

https://www.businesscreatorplus.com/google-explains-gradual-declines-in-ranking-via-martinibuster/

Tuesday, February 25, 2020

Google Patent May Explain How Sites are Ranked via @martinibuster

google-patent-may-explain-how-sites-are-ranked-via-martinibuster.png
Bill Slawski wrote about a Google patent that seems to explain what happened in the poorly named Medic update. Bill said that the scope is wider than just medical sites. The patent may show why some sites can’t rank.

Caveat About Patents

It’s important to note that Google rarely confirms whether an algorithm described in a patent is in use. This patent may or may not be part of Google’s algorithm.

What is Google’s Patent About?

The patent describes a way to classify both websites and search queries by topic.

Knowledge Domains = Topics

In this patent, the algorithm is working with what it calls Knowledge Domains which represent topics. Search queries and web pages can be said to belong to specific knowledge domains. This is how Bill describes the knowledge domains:
“The words “knowledge domain” stands for topics that a query may be about, and is not a reference to a knowledge graph.”
And in his article he states:
“Queries from specific knowledge domains (covering specific topics) might return results using sites that are classified as being from the same Knowledge Domain.”

Topic Pages

A way to simplify this concept is to think of topic buckets: pages about medical information go in one bucket, pages about natural health go into another, pages about cell phone reviews into a different bucket, pages about personal injury lawyers in a specific city into yet another, and so on.

Topic Queries

According to the patent, search queries can be recognized as belonging to their own buckets as well. So when someone searches for “what is diabetes,” Google understands this search query to be a medical question and not a natural healing question.

Google Patent Describes Classifying Sites and Queries

This is how the patent described this classification system:

Classifies websites

“The search engine… may use data from a website classification system… to generate search results. For instance, the website classification system… may generate representations for each of multiple websites… and use the representations to determine a classification for each of the multiple websites…”

Classifies search queries

“The search engine… may use a classification for a search query to select a category of websites with the same, or a similar, classification. The search engine… may determine search results from the selected category of websites.”

Sites Organized into Clusters

The patent describes a process that organizes websites by classifying them.
“…the systems and methods described in this document may improve search results pages generated by a search system by including identification of only websites with a particular classification…”
The classification system could create clusters based on the likelihood that a website would contain the answer to a query:
“The website classification system… may determine the classifications based on a likely responsiveness for the websites in the corresponding cluster. For example, the websites in the first cluster may have a higher likelihood of being responsive to queries in the particular knowledge domain than websites in the second cluster.”
The patent then describes scenarios where a site might be skipped and not classified. What I find interesting is that analysis can be skipped because a site’s representation is too distant from the known clusters of sites about a topic.
“In some implementations, one or more of the websites used during training may not be assigned to a classification. For instance, when a website representation is more than a threshold distance from a cluster, or is otherwise not included in a cluster, the website classification system… may determine to skip using the website representation to create a composite representation, e.g., may determine to skip further analysis for the website during training.”
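To make the cluster-and-threshold idea concrete, here is a toy sketch that classifies a site’s representation vector by its nearest cluster centroid and skips sites beyond a threshold distance. All vectors, labels, and the threshold are invented for illustration; the patent does not specify any of these values:

```python
# Toy illustration of the patent's idea: nearest-centroid classification
# with a distance cutoff beyond which a site is simply left unclassified.
import numpy as np

centroids = {
    "medical":        np.array([0.9, 0.1, 0.0]),
    "natural_health": np.array([0.5, 0.5, 0.0]),
    "phone_reviews":  np.array([0.0, 0.1, 0.9]),
}
THRESHOLD = 0.6  # invented cutoff

def classify(site_vec):
    domain, dist = min(
        ((name, np.linalg.norm(site_vec - c)) for name, c in centroids.items()),
        key=lambda pair: pair[1],
    )
    return domain if dist <= THRESHOLD else None  # None = skipped, per the patent

print(classify(np.array([0.8, 0.2, 0.1])))  # -> medical
print(classify(np.array([0.0, 0.9, 0.0])))  # -> None (too far from every cluster)
```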

Authoritativeness is a Classification

“…each website in the plurality of websites may have a score. The score may indicate a classification of the website, such as an authoritativeness, a responsiveness for a particular knowledge domain, another property of the website, or a combination of two or more of these.”

The Patent is About More than Medical Sites

What is important to understand is that the processes described in this patent apply to a wide range of niche topics. This is not a medical algorithm, and it is far more than a medical-related patent. According to Bill:
“The patent focused on more than just medical sites. It categorized by industry with health just being one of those. It later sorted by quality scores. The patent provided an example specifically for medical sites… But it made it clear that it involves multiple industries. The queries were classified based on knowledge domains also.”

Takeaway: Implications for Ranking

The part about clustering is intriguing because it mentions features like authoritativeness and distances from other clusters of sites. One measure of authority is links, and there is much research into algorithms that sort websites according to topics. Those algorithms choose seed sites that represent the most authoritative sites in a particular topic, and other sites are then scored according to how distant they are from the seed sites. The patent’s algorithm employs a similar system, in which a site that is distant from every cluster is essentially discarded and not considered for ranking. The patent does not mention links as a measure of authority, but link distance ranking algorithms, which classify sites by topic and create clusters of sites, mirror the way this algorithm clusters by content topic. It may not be unreasonable to speculate that this reinforces the commonly held belief (and makes it more urgent) that links from relevant pages may improve rankings.

Takeaway: Google Update Recovery

These insights into Google’s algorithm validate my suggestions about Google update recovery in general and recovering from the so-called Medic Update in particular.
“The so-called “Medic” update appeared to be clearly about relevance issues, not about author bios or “expertise.””
Perhaps one of the key insights from this patent is that it may be helpful to look at ranking issues from the perspective of relevance. In my experience consulting for sites that have lost rankings, a catastrophic collapse in rankings may be partially related to something similar to what is described in this patent. If your site dropped a few positions across the board, that may be due to other issues, like increased competition or relevance.

Read:
  • Link Distance Ranking
  • How to Recover from a Google Update
  • Website Representation Vectors
  • Google patent: Website Representation Vector to Generate Search Results and Classify Website

https://www.businesscreatorplus.com/google-patent-may-explain-how-sites-are-ranked-via-martinibuster/

Saturday, February 22, 2020

New local SERP live in Europe

By Greg Sterling In April 2019, Google was experimenting with a new local SERP that highlighted alternative directory sources for the same query. At the time, we saw an example in the wild for Germany. Now, an updated version of the SERP featuring branded directory buttons appears to be live in the UK, Belgium, Spain, Greece, and France – if not already throughout Europe. A more prominent directory box. Below is an example screenshot from a UK search, showing directory links above the map and local pack.

SERP showing results for ‘asbestos removal Halifax UK’

This change in the SERP grows out of Google’s continuing effort to comply with the European Commission’s antitrust decision in shopping search. It’s also an attempt by the company to preempt a separate antitrust action in local search. Yelp previously criticized these types of screens as a return to Google’s “rival links” remedy, which was originally proposed in 2013 and ultimately rejected by the European Commission.

UK SERP showing a local carousel above the map

How are the directories selected? One obvious and immediate question is how the displayed directories are chosen. This isn’t an ad unit, in contrast to the solution implemented in shopping search, where comparison shopping engines and Google Shopping bid against one another for placement in PLAs. There’s no comparable “sponsored” or “ad” label in the directory box or carousel above, so we must assume that Google is algorithmically choosing the directories to display. In the UK example above, clicking on the directory box links takes users to a category page in the case of Yell but a business profile page in the case of Cylex. Other searches (e.g., “dentists, London”) show a carousel with multiple, alternative directories. In some cases, the directories appear on the first page of the organic results, below the map. In other cases, they do not.

Why we care. It remains to be seen whether this approach is acceptable to the European Commission. Part of that will depend on whether the buttons drive meaningful traffic to these publishers. If so, it could revive the fortunes of at least some of them (think “barnacle SEO”), which have continued to see declining traffic as Google My Business and zero-click search grab more user focus and engagement.
 

About The Author

Greg Sterling is a Contributing Editor to Search Engine Land, a member of the programming team for SMX events and the VP, Market Insights at Uberall.

https://www.businesscreatorplus.com/new-local-serp-live-in-europe/

Friday, February 21, 2020

Google to Highlight Image Licensing Information in Image Search Results via @MattGSouthern

Google is beta testing a way for sites to highlight licensing information for images that appear in image search results. The changes to image search include a small badge indicating that a particular image is “licensable.” After a user clicks on the image to expand it, Google will show where the image can be licensed from. Sites can provide two pieces of information:
  1. A URL to a page that describes the license governing an image’s use.
  2. A URL to a page that describes where the user can find information on how to license that image.
These changes aren’t live yet, but sites can begin preparing their content ahead of time to ensure their images are eligible to display licensing information. For content to display licensing information in Google Images, the site must use image license metadata, added to each licensable image. Site owners can add the required metadata either with structured data or with IPTC photo metadata. Further instructions, and exact code snippets, can be found in Google’s official developer documentation.
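As a rough illustration of the structured-data route, here is a sketch that emits a schema.org ImageObject carrying the two URLs from the list above. The license and acquireLicensePage property names reflect Google’s developer documentation for this beta, but verify them against the current docs; all URLs are hypothetical:

```python
# Emit JSON-LD for a licensable image (property names per Google's beta docs).
import json

image_markup = {
    "@context": "https://schema.org/",
    "@type": "ImageObject",
    "contentUrl": "https://example.com/photos/black-labrador.jpg",  # hypothetical
    "license": "https://example.com/image-license",             # item 1 above
    "acquireLicensePage": "https://example.com/how-to-license",  # item 2 above
}
print(f'<script type="application/ld+json">{json.dumps(image_markup, indent=2)}</script>')
```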

No Impact on Search Rankings

Google’s John Mueller was quick to clarify that these changes have no impact on search rankings, and Google’s Danny Sullivan denied that the change has anything to do with a deceitful form of link building in which people try to gain links by falsely claiming ownership of an image. See: Google’s John Mueller to Investigate Deceitful Link Building Practices. Use of image license metadata is strictly optional. No date was provided for when Google will start displaying licensing information in image search results.

https://www.businesscreatorplus.com/google-to-highlight-image-licensing-information-in-image-search-results-via-mattgsouthern/

Monday, February 17, 2020

Google My Business: FAQ for Multiple Businesses at the Same Address

How should I get listed in Google My Business if I’ve got multiple businesses at the same address? How many listings am I eligible for if I’m legitimately running more than one business at my location? What determines eligibility, and what penalties might I incur if I make a mistake? How should I name my businesses at the same address? The FAQs surrounding this single, big topic fill local SEO forums across the web, year after year. The guidelines for representing your business on Google contain most of the answers you’re seeking about co-located businesses, but sometimes they err on the side of too little detail, leading to confusion. Today, let’s quickly tackle the commonest FAQs that local business owners and marketers raise about this scenario, and if you have further questions, please ask in the comments!

Q: I have more than one business at the same address. Can I have more than one Google My Business listing?

A: If you are legitimately operating multiple, legally distinct businesses, you can typically create a Google My Business listing for each of them. It’s not at all uncommon for more than one business to be located at a shared address. However, keep reading for further details and provisos.

Q: How do I know if my multiple businesses at the same address are distinct enough to be eligible for separate Google My Business listings?

A: If each brick-and-mortar business you operate is separately registered with the appropriate state and federal agencies, has a unique Tax ID with which you file separate taxes, meets face-to-face with customers, and has a unique phone number, then it’s typically eligible for a distinct GMB listing. However, keep reading for more information.

Q: Can service area businesses list multiple businesses at the same address?

A: Google has historically treated SABs differently than brick-and-mortar businesses. While no official guideline forbids listing multiple SABs — like plumbers and locksmiths — at the same location, it’s not considered an industry best practice to do so. Google appears to be more active in issuing hard suspensions to SABs in this scenario, even if the businesses are legitimate and distinct. Because of this, it’s a better strategy not to co-locate SABs.

Q: What would make me ineligible for more than one Google My Business listing at the same address?

A: If your businesses aren’t registered as legally distinct entities or if you lack unique phone numbers for them, you are ineligible to list them separately. Also, if your businesses are simply representative of different product lines or services you offer under the umbrella of a single business — like a handyman who repairs both water heaters and air conditioners — they aren’t eligible for separate listings. Additionally, do not list multiple businesses at PO boxes, virtual offices, mailboxes at remote locations, or at locations you don’t have the authority to represent.

Q: Will I be penalized if I list multiple ineligible businesses at the same address?

A: Yes, you could be. Google could issue a hard suspension on one or more of your ineligible listings at any time. A hard suspension means that Google has removed your listing and its associated reviews.

Q: Will suite numbers help me convince Google I actually have two locations so that I can have more than one GMB listing?

A: No. Google doesn’t pay attention to suite numbers, whether legitimate or created fictitiously. Don’t waste time attempting to make a single location appear like multiple locations by assigning different suite numbers to the entities in hopes of qualifying for multiple listings.

Q: Can I list my business at a co-working space, even though there are multiple businesses at the same address?

A: If your business has a unique, direct phone number answered by you and you are staffing the co-working space with your own staff at your listed hours, yes, you are typically eligible for a Google My Business listing. However, if any of the other businesses at the location share your categories or are competing for the same search terms, it is likely that you or your competitors will be filtered out of Google’s mapping product due to the shared elements.

Q: How many GMB listings can I have if there are multiple seasonal businesses at my address?

A: If your property hosts an organic fruit stand in summer and a Christmas tree farm in the winter, you need to closely follow Google’s requirements for seasonal businesses. In order for each entity to qualify for a listing, it must have year-round signage and set and then remove its GMB hours at the opening and closing of its season. Each entity should have a distinct name, phone number and Google categories.

Q: How should I name my multiple businesses at the same address?

A: To decrease the risk of filtering or penalties, co-located businesses must pay meticulous attention to allowed naming conventions. Questions surrounding this typically fall into five categories:
  1. If one business is contained inside another, as in the case of a McDonald’s inside a Walmart, the Google My Business names should be “McDonald’s” and “Walmart” not “McDonalds in Walmart”.
  2. If co-located brands like a Taco Bell and a Dunkin’ Donuts share the same location, they should not combine their brand names for the listing. They should either create a single listing with just one of the brand names, or, if the brands operate independently, a unique listing for each separate brand.
  3. If multiple listings actually reflect eligible departments within a business — like the sales and parts departments of a Chevrolet dealership — then it’s correct to name the listings Chevrolet Sales Department and Chevrolet Parts Department. No penalties should result from the shared branding elements, so long as the different departments have some distinct words in their names, distinct phone numbers and distinct GMB categories.
  4. If a brand sells another brand’s products — like Big-O selling Firestone Tires — don’t include the branding of the product being sold in the GMB business name. However, Google stipulates that if the business location is an authorized and fully dedicated seller of the branded product or service (sometimes known as a "franchisee"), you may use the underlying brand name when creating the listing, such as "TCC Verizon Wireless Premium Retailer.”
  5. If an owner is starting out with several new businesses at the same location, it would be a best practice to keep their names distinct. For example, a person operating a pottery studio and a pet grooming station out of the same building can lessen the chance of filters, penalties, and other problems by avoiding naming conventions like “Rainbow Pottery” and “Rainbow Pet Grooming” at the same location.

Q: Can I create separate listings for classes, meetings, or events that share a location?

A: Unfortunately the guidelines on this topic lack definition. Google says not to create such listings for any location you don’t own or have the authority to represent. But even if you do own the building, the guidelines can lead to confusion. For example, a college can create separate listings for different departments on campus, but should not create a listing for every class being offered, even if the owners of the college do have authority to represent it. Another example would be a yoga instructor who teaches at three different locations. If the building owners give them permission to list themselves at the locations, along with other instructors, the guidelines appear to permit creating multiple listings of this kind. However, such activity could end up being perceived as spam, could be filtered out because of shared elements with other yoga classes at a location, and could end up competing with the building’s own listing. Because the guidelines are not terribly clear, there is some leeway in this regard. Use your discretion in creating such listings and view them as experimental in case Google should remove them at some point.

Q: How do I set GMB hours for co-located business features that serve different functions?

A: A limited number of business models have to worry about this issue of having two sets of hours for specific features of a business that exist on the same premises but serve unique purposes. For example, a gas station can have a convenience market that is open 6 AM to 10 PM, but pumps that operate 24 hours a day. Google sums up the shortlist for such scenarios this way, which I’ll quote verbatim:
  • Banks: Use lobby hours if possible. Otherwise, use drive-through hours. An ATM attached to a bank can use its own separate listing with its own, different hours.
  • Car dealerships: Use car sales hours. If hours for new car sales and pre-owned car sales differ, use the new sales hours.
  • Gas stations: Use the hours for your gas pumps.
  • Restaurants: Use the hours when diners can sit down and dine in your restaurant. Otherwise, use takeout hours. If neither of those is possible, use drive-through hours, or, as a last resort, delivery hours.
  • Storage facilities: Use office hours. Otherwise, use front gate hours.

Q: Could the details of my Google listing get mixed up with another business at my location?

A: Not long ago, local SEO blogs frequently documented cases of listing “conflation”. Details like similar or shared names, addresses or phone numbers could cause Google to merge two listings together, resulting in strange outcomes like the reviews for one company appearing on the listing of another. This buggy mayhem, thankfully, has died down to the extent that I haven’t seen a report of listing conflation in some years. However, it’s good to remember that errors like these made it clear that each business you operate should always have its own phone number, naming should be as unique as possible, and categories should always be carefully evaluated.

Q: Why is only one of my multiple businesses at the same location ranking in Google’s local results?

A: The commonest cause of this is that Google is filtering out all but one of your businesses from ranking because of listing element similarity. If you attempt to create multiple listings for businesses that share Google categories or are competing for the same keyword phrases at the same address, Google’s filters will typically make all but one of the entities invisible at the automatic zoom level of their mapping product. For this reason, creating multiple GMB listings for businesses that share categories or industries is not a best practice and should be avoided.

Q: My GMB listing is being filtered due to co-location. What should I do?

A: This topic has come to the fore especially since Google’s rollout of the Possum filter on Sept 1, 2016. Businesses at the same address (or even in the same neighborhood) that share a category and are competing for the same search phrases often have the disappointment of discovering that their GMB listing appears to be missing from the map while a co-located or nearby competitor ranks highly. Google’s effort to deliver diversity causes them to filter out companies that they deem too similar when they’re in close proximity to one another. If you find yourself currently in a scenario where you happen to be sharing a building with a competitor, and you’ve been puzzled as to why you seem invisible on Google’s maps, zoom in on the map and see if your listing suddenly appears. If it does, chances are, you’re experiencing filtering. If this is your predicament, you have a few options for addressing it. As a measure of last resort, you could relocate your company to a part of town where you don’t have to share a location and have no nearby competitors, but this would be an extreme solution. More practically speaking, you will need to audit your competitor, comparing their metrics to yours to discover why Google sees them as the stronger search result. From the results of your audit, you can create a strategy for surpassing your opponent so that Google decides it’s your business that deserves not to be filtered out.

Summing Up

There’s nothing wrong with multiple businesses sharing an address. Google’s local index is filled with businesses in this exact situation ranking just fine without fear of penalization. But the key to success and safety in this scenario is definitely in the details. Assessing eligibility, accurately and honestly representing your brand, adhering to guidelines and best practices, and working hard to beat the filters will stand you in good stead.

https://www.businesscreatorplus.com/google-my-business-faq-for-multiple-businesses-at-the-same-address/

Sunday, February 16, 2020

How Hackers May Be Hurting Your SEO via @natalieannhoben

how-hackers-may-be-hurting-your-seo-via-natalieannhoben.png
It is easy to grow complacent as an SEO when it comes to site security, or to put all of the responsibility on I.T. departments for cybersecurity and hacking prevention. Whatever your view, one thing is definitely true: website security, or the absence of it, can directly and critically impact a site, including its organic performance. For this reason, website security should not be ignored in digital marketing plans. But first, let’s gain a deeper understanding of what hacking itself is, in order to connect the dots on why it should not be neglected.

What Is Hacking?

Hacking occurs when an individual gains access to a website or computer network without permission. It is most often done to obtain sensitive or private information, or to redirect users to a site the hacker controls.

What Are Some Common Tools Utilized by Hackers?

Malware

Malware is software designed to damage or disable a network, usually with the goal of a data breach. The after-effects of a malware attack can be severe, including extensive financial losses for an organization.

Spamming

Website spamming usually occurs when a hacker adds hyperlinks to a webpage that, when clicked, send users to the hacker’s chosen destination. Planting links to the hacker’s site on high-traffic websites can boost that site’s search engine rankings; it is essentially a shortcut around slow, ethical SEO work.
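One practical way to catch this kind of injection is to scan your pages for links pointing at domains you don’t recognize. The sketch below is not from the article; it is a minimal Python illustration using only the standard library, and the allowlist domains are hypothetical.

    from html.parser import HTMLParser
    from urllib.parse import urlparse

    ALLOWED_DOMAINS = {"example.com", "www.example.com"}  # hypothetical allowlist

    class LinkAuditor(HTMLParser):
        """Collects <a href> targets whose domain isn't on the allowlist."""
        def __init__(self):
            super().__init__()
            self.suspicious = []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            href = dict(attrs).get("href", "")
            domain = urlparse(href).netloc
            # An external domain you don't recognize is worth a manual look.
            if domain and domain not in ALLOWED_DOMAINS:
                self.suspicious.append(href)

    page_html = '<p>Buy <a href="https://spammy.example.net/pills">cheap pills</a></p>'
    auditor = LinkAuditor()
    auditor.feed(page_html)
    print(auditor.suspicious)  # ['https://spammy.example.net/pills']

Run against pages fetched from your own site, anything this flags is a candidate for cleanup, not proof of a hack, so review hits by hand.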

Effects of Hacking

The ramifications of hacking can be significant and far-reaching. These are a few of the more common things that can happen when a website is hacked.

SEO Spam

GoDaddy conducted a study a few years ago concluding that over 73% of hacked websites were hacked for SEO spam purposes. An attack like this may be planned and deliberate, or an opportunistic attempt to scrape an authoritative website and capitalize on its strong rankings and visibility. In many cases, legitimate sites are turned into link farms and visitors are tricked with phishing or malware links. Hackers may also employ SQL injections, after which a site can be flooded with spam and recovery may be very difficult.
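The SQL injection mentioned above is worth a concrete illustration. The snippet below is not from the article; it is a minimal Python sketch using the built-in sqlite3 module to show why parameterized queries matter.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE posts (id INTEGER, title TEXT)")
    conn.execute("INSERT INTO posts VALUES (1, 'Hello')")

    user_input = "1 OR 1=1"  # attacker-controlled value

    # Vulnerable: the input is pasted into the SQL string, so the WHERE
    # clause becomes "id = 1 OR 1=1" and every row comes back.
    print(conn.execute("SELECT * FROM posts WHERE id = " + user_input).fetchall())

    # Safe: the driver binds the value, so it cannot change the query's shape.
    print(conn.execute("SELECT * FROM posts WHERE id = ?", (user_input,)).fetchall())

The same principle applies to any CMS or plugin code: user input should reach the database only as a bound parameter, never as part of the query text.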

Malicious Code

If Google detects malicious code on your site, it can effectively put the site in the sandbox. Once the code is detected, Google will display a warning message when users try to navigate to the site, encouraging them to stay away. Detection can also result in the complete removal of a site from search engines in an effort to safeguard users. This influences SEO value both directly and indirectly:
  • Visits: Overall organic site traffic will most likely drop significantly.
  • Engagement metrics: Metrics such as time on site, pages per session, and bounce rate will most likely be negatively affected, which sends negative signals to Google in terms of user experience factors.
  • Mistrust: Users may be less enticed to visit again if they know your site has had one or more security issues, which also affects your traffic and, ultimately, your bottom line.

Unplanned Redirects

Oftentimes, hackers will implement redirects when a website is hacked. These send users to a different website than the one they initially navigated to. When users are directed to this separate web address, they will usually find that the site contains:
  • Malicious or deceptive content, such as scraped duplicate content or outright false claims.
  • Other scams, such as phishing, where users are enticed to click a spammy link and ultimately reveal sensitive information.
If Google crawls the redirect and sees that the destination contains questionable content, your overall organic visibility in search can be severely hurt.
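A quick way to see what a possibly hacked page is doing is to follow its redirect chain. The sketch below is not from the article; it uses the third-party requests library, and the URL is a placeholder.

    import requests  # third-party: pip install requests

    resp = requests.get("https://example.com/some-page", timeout=10)
    for hop in resp.history:              # each intermediate redirect response
        print(hop.status_code, hop.url)
    print("final:", resp.status_code, resp.url)

If the final URL lands on a domain you don’t control, or the chain includes hops you never configured, that is a strong sign the site has been compromised.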

Backlinks

Search engines carefully assess the overall reputation and value of domains and of the links between them. During a hack, links are often added to a site, most likely low-value ones, which can negatively affect SEO efforts. Your website may ultimately be flooded with backlinks from questionable sources, which will most likely decrease the level of trust Google and other search engines have in your site.

Blacklisting

Being hacked can put a site at a serious detriment in Google’s eyes. It can affect a site’s presence in the SERPs and result in multiple manual actions in Search Console if Google flags it. The kicker is that oftentimes Google does not flag it. An undetected hack usually invites further attacks, such as malware, without the webmaster knowing, and puts the site at risk of even greater losses in both visibility and revenue. This creates a bit of a conundrum: being flagged or blacklisted for malware essentially depletes your site’s visibility across the board, at least until the site is analyzed and cleaned and penalties are removed; yet not getting flagged while your site contains malware invites greater risk and penalization down the line.

Common Risks & How to Prevent Attacks

There are a few common situations that put your site at greater risk of getting hacked:

Installing Plugins or Other Tools From Untrusted Sources or Not Updating Them

Not all plugins, such as those used in a CMS like WordPress, are secure. Hackers constantly search for sites running insecure or outdated plugins and then exploit them. As a best practice, research a plugin and read its reviews before installing it on your site, and keep installed plugins updated.
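To stay on top of outdated plugins, you can compare installed versions against the latest releases. The sketch below is ours, not the article’s: it queries what we believe is the WordPress.org plugin info endpoint (the URL shape is an assumption, so verify it against current WordPress.org documentation), and the plugin slug is just an example.

    import json
    from urllib.request import urlopen

    slug = "akismet"  # example plugin slug
    # Endpoint shape is an assumption -- confirm against WordPress.org docs.
    url = f"https://api.wordpress.org/plugins/info/1.0/{slug}.json"
    with urlopen(url) as response:
        info = json.load(response)
    print(slug, "latest version:", info["version"])

Comparing that version string to what your site reports is a cheap, scriptable check to fold into routine maintenance.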

Sharing a Server May Also Pose a Risk in Terms of Site Security

This is because someone could easily upload a spammy or malicious file, or even grant access to other hackers.

Non-Secure Credentials May Also Pose a Risk for Data Security

Create secure, hard-to-guess passwords for your online accounts. Another, more advanced method of preventing an attack is penetration testing, which analyzes and tests your network’s security and any potential vulnerabilities within it.
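As a small illustration of the password recommendation (not from the article), here is one way to generate a hard-to-guess password in Python using the standard library’s secrets module, which draws from a cryptographically strong random source.

    import secrets
    import string

    # Build a 20-character password from letters, digits, and punctuation.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    password = "".join(secrets.choice(alphabet) for _ in range(20))
    print(password)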

Conclusion

Everyone is affected by web security. When building a partnership with a website or client, SEOs should be able to provide some advice when it comes to overall security. If you’re responsible for the SEO effectiveness of a site, part of your role is to ensure that there are security measures in place to protect it.

https://www.businesscreatorplus.com/how-hackers-may-be-hurting-your-seo-via-natalieannhoben/

Saturday, February 15, 2020

Google’s New Partner Program Requirements Show No Love for Agencies & This Week’s News [PODCAST] via @shepzirnheld

googles-new-partner-program-requirements-show-no-love-for-agencies-this-weeks-news-podcast-via-shepzirnheld.jpg
Join Jess Budde, Greg Finn, and Christine “Shep” Zirnheld for Marketing O’Clock. We’re breaking down these digital marketing news stories:

Google’s new Partner requirements

On this week’s episode of Rant O’Clock, Google is making Partners jump through hoops if they want to retain their badges. Spend requirements doubled, new certifications are required, and Google will consider campaign optimization scores when determining Partner status.

Active advertisers on Pinterest doubled in 2019

Pinterest’s Q4 2019 performance data is here and it’s very impressive. They saw more users, more features, and more user adoption of those features, meaning more opportunity for advertisers.

Google search update was non-core

Danny Sullivan confirms there was an update to Google Search last week, but that it was a non-core update. Some members of the search community are still reeling from the aftereffects.

Microsoft Advertising has a new feature IF you want to customize text ads

Learn how to use Microsoft’s new IF function to customize your ads based on user device or audience.
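For reference, the customizer syntax takes the general shape shown below; this illustration is ours rather than the podcast’s, and the exact parameter names should be checked against Microsoft Advertising’s documentation.

    {=IF(device=mobile, text to insert):default text}

The idea is that mobile users see the inserted text while everyone else sees the default, and an analogous form targets audience lists instead of devices.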

Say goodbye to ISP and Network Service Provider data in Google Analytics

Google Analytics no longer supports ISP and Network Service Provider data dimensions. Jess tells you how this change will affect reports and filters in GA. Then, in our take of the week segment, we’re hosting a take-off of fiery tweets about the new Google Partner requirements. We’ll answer all these marketing questions during our lightning round segment:
  • Who found two handy new Google Merchant Center reports this week?
  • What Google Ads reports now include change history information?
  • When is IGTV monetization coming to Instagram?
  • Where should you go for Valentine’s Day dinner, according to Yelp showcase ads?
  • Why are some Google Posts being rejected?
  • How can you improve your local ranking on Google?
Plus, we play Margo Polo as we discuss a movie title change in the name of SEO. Like what you heard? Head over to the Marketing O’Clock site to read today’s articles and subscribe. Thank you to our sponsors!
  • Ahrefs – An all-in-one SEO toolset that gives you the tools you need to rank your website in Google and get tons of search traffic.
  • Opteo – Helps Google Ads managers automate time-consuming manual tasks so they can spend more time on high-level strategy and creative work.

Image Credits Featured Image: Cypress North

https://www.businesscreatorplus.com/googles-new-partner-program-requirements-show-no-love-for-agencies-this-weeks-news-podcast-via-shepzirnheld/

Friday, February 14, 2020

Bayer Strives to Settle Lawsuits While Roundup Still on the Market

Article in the Wall Street Journal -- Bayer Strives to End Lawsuits Over Roundup—While Still Selling It, by Laura Kusisto, Ruth Bender, and Jacob Bunge.

https://lawprofessors.typepad.com/mass_tort_litigation/2020/02/bayer-strives-to-settle-lawsuits-while-roundup-still-on-the-market.html

https://www.forlawfirmsonly.com/bayer-strives-to-settle-lawsuits-while-roundup-still-on-the-market/

Litigation Update: New Jersey Jury Delivers a $186 MM Punitive Damages Verdict Against J&J

On February 6, Johnson & Johnson was ordered by a New Jersey jury to pay $750 MM in punitive damages to four plaintiffs suffering from mesothelioma as a result of their use of J&J’s asbestos-contaminated baby powder; the plaintiffs had previously been awarded $37.3 MM in compensatory damages after a two-month trial before a separate jury in September 2019. The verdict followed several days of testimony in which plaintiffs alleged that J&J knowingly sold asbestos-tainted baby powder and acted maliciously or in wanton and willful disregard; moreover, it was alleged that J&J had hidden evidence of the contamination with reckless indifference to the consequences. J&J CEO Alex Gorsky was compelled to testify in the punitive damages phase of the trial about statements he made on national TV in 2018 contending the company’s baby powder was safe; he stated he had relied on internal experts in making those claims. In his closing, plaintiffs’ counsel Chris Panatier urged the jurors to award an amount of punitive damages that was not a number (the company) “would laugh at”. The trial was held in J&J’s backyard in New Brunswick, and jurors deliberated for about two hours before reaching their conclusion. Citing New Jersey law, the presiding judge, Judge Ana C. Viscomi, reduced the verdict to $186.5 MM, approximately five times the compensatory damages previously awarded.
litigation-update-new-jersey-jury-delivers-a-186-mm-punitive-damages-verdict-against-jj.jpg

https://www.forlawfirmsonly.com/litigation-update-new-jersey-jury-delivers-a-186-mm-punitive-damages-verdict-against-jj/

9 Reasons to Use Instagram for Business via @KristiKellogg

9-reasons-to-use-instagram-for-business-via-kristikellogg.jpg
With more than 25 million Instagram business accounts and more than $7 billion spent on Instagram advertising last year, it’s clear that brands are investing in this channel. Others, however, remain on the sidelines, and one of the chief reasons is that they (incorrectly) believe they have nothing to post. This couldn’t be further from the truth. Whether you’re a trendy B2C company or a traditional B2B company, there’s a place for you on the ‘gram. Don’t fall into the trap of thinking your business doesn’t have anything visually interesting to post. With a little creativity and strategic planning, you’ll find there’s plenty you can post on Instagram. Here are nine reasons you should use Instagram as a business, no matter what your industry is.

1. Customers Expect It

Customers will search for you on Instagram. They might search for you specifically by name, or by hashtags relating to your business or location. Either way, not finding you is a bad experience. Even if you don’t plan to be incredibly active on Instagram, the best practice is to create an account that, at the very least, has your business name, contact information, and a few posts to showcase your brand. In any case, you don’t want that search to come up empty – or, even worse, lead them to a competitor.

2. It’s a Trust Signal

Having an Instagram account – especially a verified Instagram account – is one more signal that your business is reputable, real, and transparent. If you’re doing online business exclusively, having yet another social account where your customers can get to know your business is highly valuable.

3. Your Customers & Users Can Tag You

Let’s say you offer a client exceptional service, or a customer is over-the-moon about a product they just bought from you. It’s highly possible they’ll take to Instagram to share the story, and their glowing review is gold. That’s the kind of thing you definitely want to be tagged in so that it can show up on your Instagram account. But here’s the thing – if you don’t have an Instagram account to begin with, the customer will never be able to tag you and it’s a huge missed opportunity.

4. You Can Tag and Sell Your Products

If you sell products, the ability to share Instagram photos and videos that link directly to those products is a major win. To take advantage of this feature, you need to create a product catalog from your Facebook page (that’s where Instagram pulls the product info from).

5. Point Back to Your Site

It isn’t just products you can showcase from your Instagram account – it’s also your white papers, infographics, blog posts, and any other content you post on your website. Instagram is one more social channel where you can drive traffic back to your site. You can toggle on sharing to Facebook or Twitter directly from Instagram, as well, or use a social dashboard that lets you post the same message to different social channels with a couple of clicks.

6. Online Reputation Management

When it comes to online reputation management and search engine optimization for your brand name, having an Instagram account is a must-do. Your online reputation is critical to your business, and for that reason, you have to be vigilant about what comes up when customers search for your business or brand’s name. In addition to your website, your social channels usually show up on the first page of the search engine results, as well. For that reason, it’s a best practice to create business accounts on all social networks (including Instagram) with your brand name, even if you don’t plan on using them frequently. The idea here is to control the search engine results page as much as you can by creating profiles and content that points back to your brand. That way, in the event you do get bad publicity online, your website and social accounts have a fighting chance to rank above any negative content.

7. Your Competitors Are on Instagram

If you don’t have an Instagram account and your competitors do, you’re giving them a competitive edge, plain and simple. If you’re stumped on what kind of content to post to Instagram, look at what your competitors are doing. It’ll give you plenty of inspiration for what you can do, too!

8. Networking

Instagram is also a useful networking tool. You can like, comment and send messages to other like-minded businesses or individuals and form a relationship with them over time with meaningful interactions. That way, if you ever want to reach out to them in the real world or run into them at an event, you’ll have already laid a foundation on Instagram.

9. You Can Attract Talent

Whenever people are considering working with or for a new company, they want to know what it’s really like. Giving them a transparent glimpse of behind-the-scenes moments on Instagram (and social media in general) is a great way to show off your company culture.
Featured Image Credit: Paulo Bobita

https://www.businesscreatorplus.com/9-reasons-to-use-instagram-for-business-via-kristikellogg/

Litigation Update: Opioid MDL “Negotiation Class”

On Friday, February 7, arguments were underway in the 6th Circuit Court of Appeals as drug distributors and municipalities sought to overturn a federal district judge’s certification of a “negotiation class” tasked with achieving a global settlement between the pharmaceutical companies and the 30,000 cities and counties in the United States affected by the widespread opioid crisis. The negotiation class was approved in September 2019 by Judge Polster, who is presiding over the opioid MDL in federal district court in Ohio; it presents a novel approach to addressing a crisis that has claimed tens of thousands of lives across the country. In their opening briefs, a number of pharmaceutical companies and distributors, including AmerisourceBergen Drug Corp., McKesson Corp., Cardinal Health, Walgreen Co. and CVS Health Corp., along with six Ohio cities, opposed the certification of the negotiation class, arguing that Rule 23 of the Federal Rules of Civil Procedure does not authorize it; the cities also contended that the attorneys who appear in both the bellwether trials and the negotiation class have irreconcilable conflicts of interest, and that the complicated certification order puts too much financial pressure on the various municipalities to settle. Appellees’ briefs are due in late March.
litigation-update-opioid-mdl-negotiation-class.jpg

https://www.forlawfirmsonly.com/litigation-update-opioid-mdl-negotiation-class/

Thursday, February 13, 2020

FDA Public Hearings Held on Asbestos Testing for Talc Products and Cosmetics

On February 4, for the first time in almost fifty years, the FDA held a hearing examining the standards used for testing talc-containing products for asbestos and other potentially harmful contaminants. In response to recent reports that asbestos had been found in several talc-containing products, including a bottle of Johnson & Johnson’s Baby Powder, the FDA reviewed standards that had recently been propounded by a panel of government experts. This panel reviewed positions taken by public health authorities in the past as well as arguments put forth by plaintiffs in lawsuits alleging they developed cancer as a result of exposure to asbestos-contaminated talc products. The agency heard from consumer advocates, industry representatives and testing experts. In a written report, the panel observed that the testing methods used by the talc industry had “long-observed shortcomings in specificity and sensitivity” and made a number of recommendations, including one that minerals other than asbestos that are small enough to be breathed into the lungs should also be considered potentially harmful since they could result in “similar pathological outcomes”. The Personal Care Products Council, an industry trade group of about 600 companies, has voiced opposition to this interpretation, arguing that treating other mineral particles as possible carcinogens is not supported by science. The cosmetic talc industry has been permitted to police itself for decades with almost no oversight from the FDA; as a result, manufacturers have never been required to test their talc products for asbestos. A report published in December by Reuters showed that since the 1970s the FDA had minimized potential health concerns posed by the presence of asbestos in talc powder products, including cosmetics, and had deferred repeatedly to manufacturers. At the recent public hearing, consumer advocates and government experts argued that this era of self-regulation needs to end, that warning labels should be added to talc products, and that more stringent testing methodologies must be adopted.
fda-public-hearings-held-on-asbestos-testing-for-talc-products-and-cosmetics.jpg

https://www.forlawfirmsonly.com/fda-public-hearings-held-on-asbestos-testing-for-talc-products-and-cosmetics/

Google Confirms: No Core Update via @martinibuster

google-confirms-no-core-update-via-martinibuster.png
There have been many discussions about movements in the search results. Some have declared a February 2020 update. They were wrong.

Danny Sullivan Comments on February 2020 Update

Someone asked for a clue about what the so-called update was about. Danny Sullivan responded that Google updates all the time, and linked to a 2019 tweet explaining the difference between a daily update and a core algorithm update.

Core Algorithm Update

A core algorithm update is a major event in which significant changes are made to Google’s algorithm. This can mean an addition to the algorithm, usually in how Google determines what users mean when they make a query and what the words on a web page mean.

Daily Update

Whatever fluctuation was felt was due to the daily updates that happen all the time; Google makes updates to its algorithm every single day.

Usual Indicators of an Update

Ordinarily, there are many discussions on social media during a major update. For example, when an update hits a specific niche hard, one will see an uptick in social media posts from that niche. But that kind of activity was missing. The Facebook groups I follow were likewise missing the panicked chatter typical of updates. I know hundreds of search marketers and agency owners, and not one contacted me about it.

The No Update Update

Without a doubt, people did experience movement in the search results, but those movements were part of the updates that happen at Google every single day of every week, in every month, all year long.

What to Do if You Lost Rankings?

Wait

Wait a few days to see if the changes are reverted. It’s not unusual for Google to roll out a change then roll it back a little as it receives feedback.

Understand Relevance

Updates are often about relevance. Many publishers tend to review their sites and ask:
“What did I do wrong?”
When rankings are lost due to relevance issues, the better question to ask is:
“What did my competitor do right?”
That’s my takeaway from Google’s statement about there being “nothing to fix.” It means that the usual things publishers believe need fixing (low quality links, low quality content, and so on) aren’t the reason the site lost rankings, and that’s why there is nothing to fix. For example, if the query is “ice cream” and your page about “how to make ice cream” lost positions, it could be because Google is giving preference to pages that show where to buy ice cream, or to sites that review ice cream brands. In other words, your page is no longer relevant to the query. How does that happen? Usually it happens when Google introduces an improved way to understand what users mean when they make a search query.

Possible Explanations?

Daily Updates

The most likely explanation for what happened is that it was one of the many updates that happen every week. This could be a combination of minor changes that together resulted in a change in how search results are displayed. That’s not a core algorithm update.

Google Search Console Update

Perhaps not coincidentally, Google updated its Google Search Console (GSC). The update sent notices to publishers that their review structured data was broken and needed fixing. I fixed a number of outdated review structured data snippets and saw pages promoted to the featured snippets. A promotion to the top spot means that someone else may have lost traffic. So it’s possible that at least some of the SERP fluctuation was due to changes publishers were making to their websites.
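As a rough illustration of the kind of markup those notices concern (this example is ours, not the article’s, and every value in it is made up), a minimal schema.org Review object can be emitted as JSON-LD from Python like this:

    import json

    # Hypothetical review data; field names follow schema.org's Review type.
    review = {
        "@context": "https://schema.org",
        "@type": "Review",
        "itemReviewed": {"@type": "Product", "name": "Example Widget"},
        "reviewRating": {"@type": "Rating", "ratingValue": "4", "bestRating": "5"},
        "author": {"@type": "Person", "name": "Jane Doe"},
    }
    # Paste the output into a <script type="application/ld+json"> tag on the page.
    print(json.dumps(review, indent=2))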

Update to News

Around February 9th or 10th, 2020, Google updated something related to the Discover feed. The change resulted in web pages from offensive sites, which had previously been excluded by the algorithms, being shown. Could a change in the Discover feed lead to less traffic for other sites? Possibly.

Takeaway

The important thing to keep in mind is that there was no major Google update. The sky isn’t falling. It’s just the normal fluctuation that happens every single day. The cries of “update” from certain quarters were a false alarm.

https://www.businesscreatorplus.com/google-confirms-no-core-update-via-martinibuster/