At the end of the day, webmasters just need to know their sites: chances are your analytics tool behaves more like a person than a piece of software, and will classify traffic in irrational ways. I've seen traffic from diverse and confusing sources classed as direct, and it often takes considerable thought and exploration to work out what is actually happening.
Regarding Link Detox, its Toxic classification is generally reliable, since those links are either not indexed by Google or carry malware, viruses, and the like. Anything it flags as Suspicious, though, deserves a manual review. When I used it recently to start cleaning up our backlinks, some legitimate sites and blogs landed under Suspicious simply because they didn't have many links pointing to them.
Now, it is not that these sites are uninterested in Google users; in fact, they have hired us to help them increase their share. But they are getting so much traffic from sites like Facebook that there is less urgency about attracting organic visitors and less willingness to change the site to meet Google's organic search standards. Not long ago, sites would urgently and unquestioningly abide by those standards to court that traffic.