Talk:AI winter


Where does the second AI date (1993) come from?


This date is everywhere on the internet but doesn't seem to correspond to any specific event. Could we explain this date in more detail in the article? — Preceding unsigned comment added by Jcolafrancesco (talk • contribs) 16:51, 27 December 2020 (UTC)

There isn't really a specific date when the "winter" began -- things had been step-wise shrinking since 1988 (the start of the "Fall"). It was obvious there was a serious problem by 1990. In the 90s, companies abandoned their AI projects (i.e., expert systems) one by one, as they ran into cost and maintenance problems. Funding agencies also cut funding one by one. By 1993, you had the New York Times and Forbes running articles, from a business point of view, accusing AI of "hype". By 2000, it was "common knowledge" that AI was all hype. So there wasn't really a single "event" -- just a change of opinions of experts in the mainstream business press, based on the experiences of actual businesses over a ten-year period. It was a change of season, not an event. --- CharlesTGillingham (talk) 00:18, 20 August 2023 (UTC)

Cause


AI 'winter' necessarily follows from the spruiking of a concept having a basis only in the human imagination and no basis whatsoever in reality. There is no such thing as artificial intelligence. Never has been. And never will be. At least not by simulation implemented using digital computers. This is a necessary corollary arising from the study and understanding of the nature of deterministic computing. It is NOT possible for any deterministic computational system (e.g. an electronic digital computer) to emulate sentience (i.e. 'intelligence') in any way, shape or form. Persons who have studied and understood computer science know this and do not expound 'artificial intelligence' using digital computers, because they know that - by definition - it is NOT possible. And they possess intellectual honesty. However, those who lack this understanding and/or do not possess intellectual honesty, or possess intellectual dishonesty, DO expound the notion of 'artificial intelligence' using deterministic computing devices. Such people are examples of the phenomenon known as 'snake-oil salesmen'; that is, people seeking to enrich themselves by promoting a fraudulent notion among the ignorant. Currently there are very many persons promoting the notion of 'artificial intelligence' in a manifestly fraudulent manner only in order to enrich themselves.

The notion of 'artificial intelligence' is precisely, and only, that - a notion. As 'time travel' is a notion. Artificial intelligence has no [present] manifestation in reality because it CANNOT have a manifestation in [present] reality, just as time travel has no basis in [present -sic] reality because it CANNOT have any basis in [present] reality. — Preceding unsigned comment added by 122.151.210.84 (talk) 02:13, 27 June 2022 (UTC)

I started this page in Dec of 2005 (Talk:AI Winter). After some work, I left it to others. So, this is a nice culmination after 18 years. ... But, AI is not solely machine learning by itself. And, from the position of many, there was really no "winter" at the core. Funny money disappeared. I think that we ought to get a handle on the cyclic ways of life, even within busyness. ... Anyone for a true multidisciplinary view that will be necessary for us to tame the beasts of the buckets-of-bits? Also science? I see it lacking quite obviously as imagination runs wild like the kids at spring break. jmswtlk (talk) 01:52, 12 December 2023 (UTC)

Business


Chloe 2400:AC40:60B:D5A3:C00E:2502:7F58:8C24 (talk) 22:50, 30 June 2022 (UTC)

Remove redirect from Quantum winter


I suggest that the redirect to this page from Quantum winter should be removed, and a page about the phenomenon called "Quantum winter" be created instead.

See discussion on Talk:Quantum winter.

Liiiii (talk) 07:56, 26 March 2023 (UTC)


AI Spring Update


The current AI Spring has created some undeniable content, in the form of the increasing cadence of development, the explosion of competition to create the next big thing, and the pushback from industry and government to impede progress. Is this on anyone's radar? Hcsnoke (talk) 15:17, 11 April 2023 (UTC)

Investment:250000,Cmp:457,Target:533,Stop:401 loss Find Qty Profit loss In bank nifty


Investment:250000,Cmp:457,Target:533,Stop:401 loss Find Qty Profit loss In bank nifty 182.48.209.195 (talk) 18:29, 25 June 2023 (UTC)

Cutting all this stuff


Cutting this because (1) I think the article is better if we tighten it to cover just the history. (2) These sections are each at very different levels of reliability and notability. It would take a good editor and a lot of research to figure out exactly what's notable and to make this section WP:comprehensive. (3) My (personal) impression is that there are almost as many theories as there were observers, and that there is no way to settle the issue definitively.

If anyone disagrees, feel free to take it on. ---- CharlesTGillingham (talk) 00:33, 20 August 2023 (UTC)

Underlying causes behind AI winters

Several explanations have been put forth for the cause of AI winters in general. As AI progressed from government-funded applications to commercial ones, new dynamics came into play. While hype is the most commonly cited cause, the explanations are not necessarily mutually exclusive.

Hype

The AI winters can[citation needed] be partly understood as a sequence of over-inflated expectations and subsequent crashes, of the kind seen in stock markets and exemplified[citation needed] by the railway mania and the dotcom bubble. In a common pattern in the development of new technology (known as the hype cycle), an event, typically a technological breakthrough, creates publicity which feeds on itself to create a "peak of inflated expectations" followed by a "trough of disillusionment". Since scientific and technological progress cannot keep pace with the publicity-fueled increase in expectations among investors and other stakeholders, a crash must follow. AI technology seems to be no exception to this rule.[citation needed]

For example, in the 1960s the realization that computers could simulate single-layer neural networks led to a neural-network hype cycle that lasted until the 1969 publication of the book Perceptrons, which showed severe limits on the set of problems single-layer networks could solve. In 1985 the realization that neural networks could be used to solve optimization problems, as a result of famous papers by Hopfield and Tank,[1][2] together with the threat of Japan's fifth-generation project, led to renewed interest and application.
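
As an aside, the single-layer limitation that Perceptrons formalized is easy to demonstrate. The following is a minimal sketch (in Python with NumPy; it is not from the cut article text, and the function names and parameters are purely illustrative): the classic perceptron learning rule converges on the linearly separable AND function but can never fit XOR.

  import numpy as np

  def train_perceptron(X, y, epochs=100, lr=0.1):
      # Classic perceptron rule: adjust weights only on misclassified points.
      w, b = np.zeros(X.shape[1]), 0.0
      for _ in range(epochs):
          for xi, target in zip(X, y):
              pred = 1 if xi @ w + b > 0 else 0
              w += lr * (target - pred) * xi
              b += lr * (target - pred)
      return w, b

  def accuracy(X, y, w, b):
      return ((X @ w + b > 0).astype(int) == y).mean()

  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
  targets = {"AND": np.array([0, 0, 0, 1]),   # linearly separable: learnable
             "XOR": np.array([0, 1, 1, 0])}   # not linearly separable: never learnable
  for name, y in targets.items():
      w, b = train_perceptron(X, y)
      print(name, accuracy(X, y, w, b))
  # AND reaches accuracy 1.0; XOR never does, regardless of epochs or learning rate.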

Institutional factors

Another factor is AI's place in the organisation of universities. Research on AI often takes the form of interdisciplinary research. AI is therefore prone to the same problems other types of interdisciplinary research face. Funding is channeled through the established departments, and during budget cuts there will be a tendency to shield the "core contents" of each department at the expense of interdisciplinary and less traditional research projects.

Economic factors

Downturns in a country's national economy cause budget cuts in universities. The "core contents" tendency worsens the effect on AI research, and investors in the market are likely to put their money into less risky ventures during a crisis. Together this may amplify an economic downturn into an AI winter. It is worth noting that the Lighthill report came at a time of economic crisis in the UK,[3] when universities had to make cuts and the only question was which programs should go.

Insufficient computing capability

Early in computing history the potential of neural networks was understood, but it could not be realized with the computing capacity available at the time. Fairly simple networks require significant computing capacity even by today's standards.

Empty pipeline

It is common to see the relationship between basic research and technology as a pipeline. Advances in basic research give birth to advances in applied research, which in turn leads to new commercial applications. From this it is often argued that a lack of basic research will lead to a drop in marketable technology some years down the line. This view was advanced by James Hendler in 2008,[4] when he claimed that the fall of expert systems in the late '80s was not due to an inherent and unavoidable brittleness of expert systems, but to funding cuts in basic research in the 1970s. These expert systems advanced in the 1980s through applied research and product development, but, by the end of the decade, the pipeline had run dry and expert systems were unable to produce improvements that could have overcome this brittleness and secured further funding.

Failure to adapt

The fall of the LISP machine market and the failure of the fifth generation computers were cases of expensive advanced products being overtaken by simpler and cheaper alternatives. This fits the definition of a low-end disruptive technology, with the LISP machine makers being marginalized. Expert systems were carried over to the new desktop computers by, for instance, CLIPS, so the fall of the LISP machine market and the fall of expert systems are, strictly speaking, two separate events. Still, the failure to adapt to such a change in the outside computing milieu is cited as one reason for the 1980s AI winter.[4]


References

  1. ^ Hopfield, J. J.; Tank, D. W. (July 1985). ""Neural" computation of decisions in optimization problems". Biological Cybernetics. Vol. 52.
  2. ^ Hopfield, J. J.; Tank, D. W. (August 1986). "Computing with Neural Networks". Science. Vol. 233, no. 4764. doi:10.1126/science.3755256.
  3. ^ Muggleton, Stephen (2007-07-10). "Obituary: Donald Michie". The Guardian. Retrieved 2022-12-02.
  4. ^ a b Hendler, James (March/April 2008). "Avoiding Another AI Winter". IEEE Intelligent Systems. Vol. 23, no. 2. pp. 2–4.

CharlesTGillingham (talk) 00:33, 20 August 2023 (UTC)

A revert


The article discusses the origin of the term in the second paragraph. I reverted an edit which changed:

  • Roger Schank and Marvin Minsky—two leading AI researchers who had survived the "winter" of the 1970s

To:

  • Roger Schank and Marvin Minsky—two AI researchers who had worked on AI in the 1970s

It's important that (1) they were leaders and (2) they had already survived one "winter", because this is the "who" and "why" here, and it's essential. (Lots of people worked in AI in random periods of time in the past, so the second version doesn't actually say anything notable.) The reader needs to know that (1) these are people worth listening to, and that (2) they were speaking from experience.

You could argue that the word "survived" is WP:peacock and tone it down to "experienced" or something. That would be fine. ---- CharlesTGillingham (talk) 00:17, 25 August 2023 (UTC)

1) "Survived" is an extreme term, as there was very little chance of either of them losing their jobs. Experienced is better.
2) If dedication to who and why is essential, it is worth noting here that Schank's work in AI really began in the mid-1970s, which was either during or after the first Winter occurred (depending on your comfort with dates), after he became a professor in 1974, His notable work occurred after that. In fact, it could be stated that he was late to the party relative to "leaders" like Minsky, Feigenbaum, McCarthy, et al. Andreldritch (talk) 11:13, 25 August 2023 (UTC)Reply

English speech about raising challenge


OR TAMBO 41.13.191.75 (talk) 17:56, 21 April 2024 (UTC)

Sterling 49.177.23.51 (talk) 06:01, 23 May 2024 (UTC)

FRENCH.


Describe it to help the police find it. 154.161.152.26 (talk) 17:15, 30 May 2024 (UTC)