Notes for "Filter Bubble" Page Edits
Throughout this page, all of my edits are bolded and some of my general thoughts, suggestions, and justifications for edits are interspersed and italicized. I am going to leave a summary of my edits and contributions here when I am done.
My general overview and thoughts on the article:
As many other users have discussed already, this article touches on a number of complex, interrelated issues and concepts. I do not have anywhere near the whole article here in my sandbox. Instead, I have just listed the edits I am considering so far.
List of suggested edits for the “Filter bubble” Wikipedia page
(This list is organized in the order that the article’s headings appear, based on which sections I have tackled so far and ending with notes on the current sources).
Notes for next time (today is 4/18): make sure to check the Arch Intern Med cite for relevance and find stable links for all cites.
Lead:
For the lead section I added a link to the well-being page and mentioned misinformation in order to frame and preview my contributions both to the ethical implications section and to other sections that touch upon misinformation. The studies I cite regarding filter bubbles and their effects on health misinformation do not all explicitly use the term well-being, but I figured the term was a general enough catch-all to express the variety of specific issues touched upon by the health-related sources I added (pseudo-science, suicide, reliance on the internet for health information rather than medical professionals, etc.).
A filter bubble – a term coined by internet activist Eli Pariser – is a state of intellectual isolation that allegedly can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior and search history. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. The choices made by these algorithms are not transparent. Prime examples include Google Personalized Search results and Facebook's personalized news-stream. The bubble effect may have negative implications for civic discourse, according to Pariser, but contrasting views regard the effect as minimal and addressable. The results of the U.S. presidential election in 2016 have been associated with the influence of social media platforms such as Twitter and Facebook, and as a result have called into question the effects of the "filter bubble" phenomenon on user exposure to fake news and echo chambers, spurring new interest in the term, with many concerned that the phenomenon may harm democracy and well-being by worsening the effects of misinformation.[1][2][3]
(Technology such as social media) “lets you go off with like-minded people, so you're not mixing and sharing and understanding other points of view ... It's super important. It's turned out to be more of a problem than I, or many others, would have expected.”
— Bill Gates in Quartz
Concept Section:
Grammatical error(s):
Second-to-last paragraph: “Many people are unaware that filter bubbles even exist. This can be seen in an article [change on to from] The Guardian”.
Phrasing suggestions:
“Interviewing programmers at Google off the record journalist Per Grankvist found that user data used to play a bigger role in determining search results but that Google, through testing, found that the search query is by far the best determinator on what results to display.” This sentence may read more clearly if the ending were changed to “the best method for determining what results to display”.
I think that moving a few paragraphs/ideas from the current concept section to the similar concepts section may improve the article's organization and reading experience, and may help more accurately explain how the idea of a filter bubble originates with Pariser as an articulation of a phenomenon that is interrelated with many other factors involved in online polarization, privacy, disinformation, and confirmation bias.
Here are the paragraphs I think should be moved from concept to similar concepts:
"Other terms have been used to describe this phenomenon, including "ideological frames" and "the figurative sphere surrounding you as you search the internet". A related term, "echo chamber", was originally applied to news media, but is now applied to social media as well."
AND
"A filter bubble has been described as exacerbating a phenomenon that has been called splinternet or cyberbalkanization, which happens when the internet becomes divided up into sub-groups of like-minded people who become insulated within their own online community and fail to get exposure to different views. This concern dates back to the early days of the publicly accessible internet, with the term "cyberbalkanization" being coined in 1996."
Both of these points are highly relevant, but I think moving them to similar concepts would be more accurate; moving them may also help tidy up the organization of the article. The concept section meanders a bit and is also quite long, whereas the similar concepts section is relatively short and could use more content.
Concept
The term was coined by internet activist Eli Pariser circa 2010 and discussed in his 2011 book of the same name; according to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble. He related an example in which one user searched Google for "BP" and got investment news about British Petroleum, while another searcher got information about the Deepwater Horizon oil spill, and noted that the two search results pages were "strikingly different".
Pariser defined his concept of a filter bubble in more formal terms as "that personal ecosystem of information that's been catered by these algorithms". An internet user's past browsing and search history is built up over time when they indicate interest in topics by "clicking links, viewing friends, putting movies in [their] queue, reading news stories", and so forth. An internet firm then uses this information to target advertising to the user, or make certain types of information appear more prominently in search results pages.
This process is not random, as it operates under a three-step process, per Pariser, who states, "First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune to get the fit just right. Your identity shapes your media." Pariser also reports:
According to one Wall Street Journal study, the top fifty Internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons each. Search for a word like "depression" on Dictionary.com, and the site installs up to 223 tracking cookies and beacons on your computer so that other Web sites can target you with antidepressants. Share an article about cooking on ABC News, and you may be chased around the Web by ads for Teflon-coated pots. Open—even for an instant—a page listing signs that your spouse may be cheating and prepare to be haunted with DNA paternity-test ads.
Accessing the data of link clicks, displayed through site traffic measurements, can determine whether filter bubbles are collective or individual.
As of 2011, one engineer had told Pariser that Google looked at 57 different pieces of data to personally tailor a user's search results, including non-cookie data such as the type of computer being used and the user's physical location.
Other terms have been used to describe this phenomenon, including "ideological frames" and "the figurative sphere surrounding you as you search the internet". A related term, "echo chamber", was originally applied to news media, but is now applied to social media as well.
Pariser's idea of the filter bubble was popularized after the TED talk he gave in May 2011, in which he gives examples of how filter bubbles work and where they can be seen. In a test seeking to demonstrate the filter bubble effect, Pariser asked several friends to search for the word "Egypt" on Google and send him the results. Comparing two of the friends' first pages of results, while there was overlap between them on topics like news and travel, one friend's results prominently included links to information on the then-ongoing Egyptian revolution of 2011, while the other friend's first page of results did not include such links.
In The Filter Bubble, Pariser warns that a potential downside to filtered searching is that it "closes us off to new ideas, subjects, and important information", and "creates the impression that our narrow self-interest is all that exists". It is potentially harmful to both individuals and society, in his view. He criticized Google and Facebook for offering users "too much candy, and not enough carrots". He warned that "invisible algorithmic editing of the web" may limit our exposure to new information and narrow our outlook. According to Pariser, the detrimental effects of filter bubbles include harm to the general society in the sense that they have the possibility of "undermining civic discourse" and making people more vulnerable to "propaganda and manipulation". He wrote:
A world constructed from the familiar is a world in which there's nothing to learn ... (since there is) invisible autopropaganda, indoctrinating us with our own ideas.
Many people are unaware that filter bubbles even exist. This can be seen in an article on The Guardian, which mentioned the fact that "more than 60% of Facebook users are entirely unaware of any curation on Facebook at all, believing instead that every single story from their friends and followed pages appeared in their news feed." A brief explanation for how Facebook decides what goes on a user's news feed is through an algorithm which takes into account "how you have interacted with similar posts in the past."
A filter bubble has been described as exacerbating a phenomenon that has been called splinternet or cyberbalkanization, which happens when the internet becomes divided up into sub-groups of like-minded people who become insulated within their own online community and fail to get exposure to different views. This concern dates back to the early days of the publicly accessible internet, with the term "cyberbalkanization" being coined in 1996.
Platform studies section
While algorithms do limit political diversity, some of the filter bubble is the result of user choice. A study by data scientists at Facebook found that for every four Facebook friends that share ideology, users have one friend with contrasting views. No matter what Facebook's algorithm for its News Feed is, people are simply more likely to befriend/follow people who share similar beliefs. The nature of the algorithm is that it ranks stories based on a user's history, resulting in a reduction of the "politically cross-cutting content by 5 percent for conservatives and 8 percent for liberals". However, even when people are given the option to click on a link offering contrasting views, they still default to their most viewed sources. "[U]ser choice decreases the likelihood of clicking on a cross-cutting link by 17 percent for conservatives and 6 percent for liberals." A cross-cutting link is one that introduces a different point of view than the user's presumed point of view, or what the website has pegged as the user's beliefs.

A recent study from Levi Boxell, Matthew Gentzkow, and Jesse M. Shapiro suggests that online media isn't the driving force for political polarization. The paper argues that polarization has been driven by the demographic groups that spend the least time online. The greatest ideological divide is experienced amongst Americans older than 75, while only 20% reported using social media as of 2012. In contrast, 80% of Americans aged 18–39 reported using social media as of 2012. The data suggests that the younger demographic was not any more polarized in 2012 than it had been when online media barely existed in 1996. The study highlights differences between age groups and how news consumption remains polarized as people seek information that appeals to their preconceptions. Older Americans usually remain stagnant in their political views as traditional media outlets continue to be a primary source of news, while online media is the leading source for the younger demographic. Although algorithms and filter bubbles weaken content diversity, this study reveals that political polarization trends are primarily driven by pre-existing views and failure to recognize outside sources.
I have added another study here that adds to the discussion of demographic and ideological factors related to filter bubbles and their effects, and that expands the discussion to personality factors.
A 2020 study from Germany utilized the Big Five personality model to test the effects of individual personality, demographics, and ideologies on user news consumption.[4] Basing their study on the notion that the number of news sources that users consume impacts their likelihood of being caught in a filter bubble—with higher media diversity lessening the chances—their results suggest that certain demographics (higher age and male) along with certain personality traits (high openness) correlate positively with the number of news sources consumed by individuals. The study also found a negative ideological association between media diversity and the degree to which users align with right-wing authoritarianism. Beyond offering different individual user factors that may influence the role of user choice, this study also raises questions about associations between the likelihood of users being caught in filter bubbles and user voting behavior.[4]
The Facebook study found that it was "inconclusive" whether or not the algorithm played as big a role in filtering News Feeds as people assumed. The study also found that "individual choice", or confirmation bias, likewise affected what gets filtered out of News Feeds. Some social scientists criticized this conclusion though, because the point of protesting the filter bubble is that the algorithms and individual choice work together to filter out News Feeds. They also criticized Facebook's small sample size, which is about "9% of actual Facebook users", and the fact that the study results are "not reproducible" due to the fact that the study was conducted by "Facebook scientists" who had access to data that Facebook does not make available to outside researchers.
Though the study found that only about 15–20% of the average user's Facebook friends subscribe to the opposite side of the political spectrum, Julia Kaman from Vox theorized that this could have potentially positive implications for viewpoint diversity. These "friends" are often acquaintances with whom we would not likely share our politics without the internet. Facebook may foster a unique environment where a user sees and possibly interacts with content posted or re-posted by these "second-tier" friends. The study found that "24 percent of the news items liberals saw were conservative-leaning and 38 percent of the news conservatives saw [check this quote to make sure this grammar is correct] was liberal-leaning." "Liberals tend to be connected to fewer friends who share information from the other side, compared with their conservative counterparts." This interplay has the ability to provide diverse information and sources that could moderate users' views.
Similarly, a study of Twitter's filter bubbles by New York University concluded that "Individuals now have access to a wider span of viewpoints about news events, and most of this information is not coming through the traditional channels, but either directly from political actors or through their friends and relatives. Furthermore, the interactive nature of social media creates opportunities for individuals to discuss political events with their peers, including those with whom they have weak social ties". According to these studies, social media may be diversifying information and opinions users come into contact with, though there is much speculation around filter bubbles and their ability to create deeper political polarization.
I have created a new paragraph/subsection within the Platform Studies section regarding the use of social bots by researchers studying filter bubbles and polarization.
Social bots have been utilized by different researchers to test polarization and related effects that are attributed to filter bubbles and echo chambers.[5][6] A 2018 study used social bots on Twitter to test deliberate user exposure to partisan viewpoints.[5] The study claimed it demonstrated partisan differences between exposure to differing views, although it warned that the findings should be limited to party-registered American Twitter users. One of the main findings was that after exposure to differing views (provided by the bots), self-registered Republicans became more conservative, whereas self-registered liberals showed less ideological change, if any at all. A different study from the People's Republic of China utilized social bots on Weibo, the largest social media platform in China, to examine the structure of filter bubbles with regard to their effects on polarization.[6] The study draws a distinction between two conceptions of polarization: opinion polarization, in which people with similar views form groups, share similar opinions, and block out differing viewpoints, and information polarization, in which people do not access diverse content and sources of information. By utilizing social bots instead of human volunteers and focusing on information polarization rather than opinion polarization, the researchers concluded that there are two essential elements of a filter bubble: a large concentration of users around a single topic and a uni-directional, star-like structure that impacts key information flows.
In June 2018, the platform DuckDuckGo conducted a research study on the Google search engine.[7]--[this whole paragraph lacked citations, so here is at least a link to the DuckDuckGo study it mentions]. For this study, 87 adults in various locations around the continental United States googled three keywords at the exact same time: immigration, gun control, and vaccinations. Even when in private browsing mode, most people saw results unique to them. Google included certain links for some participants that it did not include for others, and the News and Videos infoboxes showed significant variation. Google publicly disputed these results, saying that Search Engine Results Page (SERP) personalization is mostly a myth. Google Search Liaison Danny Sullivan stated that “Over the years, a myth has developed that Google Search personalizes so much that for the same query, different people might get significantly different results from each other. This isn’t the case. Results can differ, but usually for non-personalized reasons.”[8]--[this cite should fix the "citation needed" request for this quote].
When filter bubbles are in place they can create specific moments that scientists call 'Whoa' moments. A 'Whoa' moment is when an article, ad, post, etc. appears on your computer that is related to a current action or current use of an object. Scientists discovered this term after a young woman was performing her daily routine, which included drinking coffee, when she opened her computer and noticed an advertisement for the same brand of coffee that she was drinking. "Sat down and opened up Facebook this morning while having my coffee, and there they were two ads for Nespresso. Kind of a 'whoa' moment when the product you're drinking pops up on the screen in front of you." "Whoa" moments occur when people are "found", meaning that advertisement algorithms target specific users based on their "click behavior" in order to increase their sales revenue. "Whoa" moments can also ignite discipline in users to stick to a routine and commonality with a product.
Several designers have developed tools to counteract the effects of filter bubbles (see § Counter measures). Swiss radio station SRF voted the word filterblase (the German translation of filter bubble) word of the year 2016.
Academia studies and reactions section
A scientific study from Wharton that analyzed personalized recommendations also found that these filters can actually create commonality, not fragmentation, in online music taste. Consumers reportedly use the filters to expand their taste rather than to limit it. Harvard law professor Jonathan Zittrain disputed the extent to which personalization filters distort Google search results, saying that "the effects of search personalization have been light". Further, Google provides the ability for users to shut off personalization features if they choose, by deleting Google's record of their search history and setting Google to not remember their search keywords and visited links in the future.
For the Academia studies and reactions section I have added a few articles that discuss how some scholars report that filter bubbles lack clear, shared definitions and rigorous empirical evidence across scholarly disciplines:
A study from Internet Policy Review addressed the lack of a clear and testable definition for filter bubbles across disciplines; this often results in researchers defining and studying filter bubbles in different ways.[9] Subsequently, the study noted a lack of empirical data for the existence of filter bubbles across disciplines[10] and suggested that the effects attributed to them may stem more from preexisting ideological biases than from algorithms. Similar views can be found in other academic projects, which also address concerns with the definitions of filter bubbles and the relationships between the ideological and technological factors associated with them.[11]
A study by researchers from Oxford, Stanford, and Microsoft examined the browsing histories of 1.2 million U.S. users of the Bing Toolbar add-on for Internet Explorer between March and May 2013. They selected 50,000 of those users who were active consumers of news, then classified whether the news outlets they visited were left- or right-leaning, based on whether the majority of voters in the counties associated with user IP addresses voted for Obama or Romney in the 2012 presidential election. They then identified whether news stories were read after accessing the publisher's site directly, via the Google News aggregation service, via web searches, or via social media. The researchers found that while web searches and social media do contribute to ideological segregation, the vast majority of online news consumption consisted of users directly visiting left- or right-leaning mainstream news sites, and consequently being exposed almost exclusively to views from a single side of the political spectrum. Limitations of the study included selection issues such as Internet Explorer users skewing higher in age than the general internet population; Bing Toolbar usage and the voluntary (or unknowing) sharing of browsing history selecting for users who are less concerned about privacy; the assumption that all stories in left-leaning publications are left-leaning, and the same for right-leaning; and the possibility that users who are not active news consumers may get most of their news via social media, and thus experience stronger effects of social or algorithmic bias than those users who essentially self-select their bias through their choice of news publications (assuming they are aware of the publications' biases).
Counter measures
By media
The Flipside news outlet should be added as a source, potentially with this quote as well: "keeps me grounded in reality and not just in my bubble".
Websites such as allsides.com [, theflipside.io,][12] and hifromtheotherside.com aim to expose readers to different perspectives with diverse content. Some additional plug-ins aim to help people step out of their filter bubbles and make them aware of their personal perspectives; thus, these media show content that contradicts their beliefs and opinions. For instance, Escape Your Bubble asks users to indicate a specific political party they want to be more informed about. The plug-in will then suggest articles from well-established sources to read relating to that political party, encouraging users to become more educated about the other party. In addition to plug-ins, there are apps created with the mission of encouraging users to open their echo chambers. UnFound.news offers an AI (artificial intelligence) curated news app to readers, presenting them with news from diverse and distinct perspectives, helping them form rational and informed opinions rather than succumbing to their own biases. It also nudges readers to read different perspectives if their reading pattern is biased towards one side/ideology. Read Across the Aisle is a news app that reveals whether or not users are reading from diverse news sources that include multiple perspectives. Each source is color coordinated, representing the political leaning of each article. When users only read news from one perspective, the app communicates that to the user and encourages readers to explore other sources with opposing viewpoints. Although apps and plug-ins are tools humans can use, Eli Pariser stated "certainly, there is some individual responsibility here to really seek out new sources and people who aren't like you."
Ethical implications

As the popularity of cloud services increases, personalized algorithms used to construct filter bubbles are expected to become more widespread. Scholars have begun considering the effect of filter bubbles on the users of social media from an ethical standpoint, particularly concerning the areas of personal freedom, security, and information bias. Filter bubbles in popular social media and personalized search sites can determine the particular content seen by users, often without their direct consent or cognizance, due to the algorithms used to curate that content. Self-created content manifested from behavior patterns can lead to partial information blindness. Critics of the use of filter bubbles speculate that individuals may lose autonomy over their own social media experience and have their identities socially constructed as a result of the pervasiveness of filter bubbles.
Technologists, social media engineers, and computer specialists have also examined the prevalence of filter bubbles. Mark Zuckerberg, founder of Facebook, and Eli Pariser, author of The Filter Bubble, have expressed concerns regarding the risks of privacy and information polarization. The information of the users of personalized search engines and social media platforms is not private, though some people believe it should be. The concern over privacy has resulted in a debate as to whether or not it is moral for information technologists to take users' online activity and manipulate future exposure to related information.
For the Ethical Implications section I have added four sources which discuss scholarly views on the effects of filter bubbles and related ideas (echo chambers, etc.) on healthcare information and consumer behavior/beliefs. I chose to add this particular set of contributions here because the sentences that precede them mention "biased and misleading information" and it seemed like a logical place to include these sources about healthcare misinformation.
Since the content seen by individual social media users is influenced by algorithms that produce filter bubbles, users of social media platforms are more susceptible to confirmation bias and may be exposed to biased, misleading information. Social sorting and other unintentional discriminatory practices are also anticipated as a result of personalized filtering. Some scholars have expressed concerns regarding the effects of filter bubbles on individual and social well-being, i.e. the dissemination of health information to the general public and the potential of internet search engines to alter health-related behavior.[1][3][2][13] A 2019 multi-disciplinary book reported research and perspectives on the roles filter bubbles play in regard to health misinformation. Drawing from various fields such as journalism, law, medicine, and health psychology, the book addresses different controversial health beliefs (e.g. alternative medicine and pseudoscience) as well as potential remedies to the negative effects of filter bubbles and echo chambers on different topics in health discourse. A different 2016 study on the potential effects of filter bubbles on search engine results related to suicide found that algorithms play an important role in whether or not helplines and similar search results are displayed to users, and discussed the implications their research may have for health policies.[2] Another 2016 study from the Croatian Medical Journal proposed some strategies for mitigating the potentially harmful effects of filter bubbles on health information, such as informing the public more about filter bubbles and their associated effects, users deciding to try an alternative search engine [to Google], and more explanation of the processes search engines use to determine their displayed results.[1]
In light of the 2016 U.S. presidential election, scholars have likewise expressed concerns about the effect of filter bubbles on democracy and democratic processes, as well as the rise of "ideological media". These scholars fear that users will be unable to "[think] beyond [their] narrow self-interest" as filter bubbles create personalized social feeds, isolating them from diverse points of view and their surrounding communities. For this reason, the possibility of designing social media with more serendipity is increasingly discussed, that is, of proactively recommending content that lies outside one's filter bubble, including challenging political information, and, eventually, of providing empowering filters and tools to users. A related concern is in fact how filter bubbles contribute to the proliferation of "fake news" and how this may influence political leaning, including how users vote.
Revelations in March 2018 of Cambridge Analytica's harvesting and use of user data for at least 87 million Facebook profiles during the 2016 presidential election highlight the ethical implications of filter bubbles. Christopher Wylie, co-founder of Cambridge Analytica and whistleblower, detailed how the firm had the ability to develop "psychographic" profiles of those users and use the information to shape their voting behavior. Access to user data by third parties such as Cambridge Analytica can exacerbate and amplify existing filter bubbles users have created, artificially increasing existing biases and further dividing societies.
Filter bubbles have stemmed from a surge in media personalization, which can trap users. The use of AI to personalize offerings can lead to users viewing only content that reinforces their own viewpoints without challenging them. Social media websites like Facebook may also present content in a way that makes it difficult for users to determine the source of the content, leaving them to decide for themselves whether the source is reliable or fake. This can lead to people becoming used to hearing what they want to hear, which can cause them to react more radically when they see an opposing viewpoint. The filter bubble may cause the person to see any opposing viewpoints as incorrect and could allow the media to force views onto consumers.
See also
Notes about the article’s current sources:
For source 18-c, there is both a video and an article from CNN with the title “What the Internet is hiding from you”. The video link works but does not support claim 18-b (there is no reference in the video to Google having collected 10 years' worth of data). 18-c references this same video, but the link actually needs to be replaced with the article of the same name.[14]
References
- ^ a b c Holone, Harald (2016). "The filter bubble and its effect on online personal health information". Croatian Medical Journal.
- ^ a b c Haim, Mario; Arendt, Florian; Scherr, Sebastian (2017). "Abyss or Shelter? On the Relevance of Web Search Engines' Search Results When People Google for Suicide". Health Communication.
- ^ a b Lavorgna, Anita; Ronco, Anna Di (2019). Medical misinformation and social harm in non science based health practices. Routledge.
- ^ a b Sindermann, Cornelia (2020). "Age, gender, personality, ideological attitudes and individual differences in a person's news spectrum: how many and who might be prone to "filter bubbles" and "echo chambers" online?". Heliyon.
- ^ a b Bail, Christopher; Argyle, Lisa; Brown, Taylor; Chen, Haohan; Hunzaker, M.B.F.; Lee, Jaemin (2018). "Exposure to Opposing Views can Increase Political Polarization: Evidence from a Large-Scale Field Experiment on Social Media" (PDF). SocArXiv.
- ^ a b Min, Yong; Jiang, Tingjun; Jin, Cheng; Li, Qu; Jin, Xiaogang (2019-11-29). "Endogenetic structure of filter bubble in social networks". Royal Society Open Science. 6 (11): 190868. doi:10.1098/rsos.190868. ISSN 2054-5703. PMC 6894573. PMID 31827834.
- ^ "Measuring the Filter Bubble: How Google is influencing what you click". DuckDuckGo Blog. 2018-12-04. Retrieved 2020-04-15.
- ^ Statt, Nick (2018-12-04). "Google personalizes search results even when you're logged out, new study claims". The Verge. Retrieved 2020-04-15.
- ^ Bruns, Axel (2019-11-29). "Filter bubble". Internet Policy Review. 8 (4). doi:10.14763/2019.4.1426. ISSN 2197-6775.
- ^ Difranzo, Dominic (April 2017). "Filter bubbles and fake news". Crossroads. 23 (3): 32–35 – via ResearchGate.
- ^ Davies, Huw C (2018-03-16). "Redefining Filter Bubbles as (Escapable) Socio-Technical Recursion". Sociological Research Online. 23 (3): 637–654. doi:10.1177/1360780418763824. ISSN 1360-7804.
- ^ "About Us". www.theflipside.io. Retrieved 2020-04-09.
- ^ Hesse, Bradford (2005). "Trust and Sources of Health Information: The Impact of the Internet and Its Implications for Health Care Providers: Findings From the First Health Information National Trends Survey". Arch Intern Med.
- ^ "What the Internet is hiding from you". www.cnn.com. Retrieved 2020-03-27.