{{WikiProject banner shell|class=C| {{WikiProject Computer science|importance=high}} }}
{{todo}}

== Earlier discussion at recursion page ==
:<small>Add section title. —Nils von Barth ([[User:Nbarth|nbarth]]) ([[User talk:Nbarth|talk]]) 17:56, 25 July 2012 (UTC)</small>
[[User:Malcohol/Corecursion|Here is a discussion]] (which was started at [[Talk:Recursion]]) about how the recursion page at that time failed to properly account for corecursion. --[[User:Malcohol|Malcohol]] 15:08, 6 June 2006 (UTC)

== Duality? ==
I don't understand the definition ''corecursion is a type of operation that is dual to recursion'' because I don't know which duality it refers to. I know lots of dualities (e.g. min/max in lattices, point/line in projective geometry). Is the particular duality referred to here something to do with reversing arrows in category theory? If so, perhaps the article should say something about it, or at least make reference to a page that does. [[User:Tom Duff|Tom Duff]] 19:48, 15 January 2007 (UTC)

Also, what does [[bisimulation]] (one of the references) have to do with corecursion? [[User:Tom Duff|Tom Duff]] 19:51, 15 January 2007 (UTC)

I added a reference to David Turner's "Total Functional Programming" paper, which might help: a significant part of it is the introduction and discussion of codata, corecursion, and coinduction as first-class duals to data, recursion, and induction. --[[User:Piet Delport|Piet Delport]] 15:27, 26 January 2007 (UTC)

== Code Errors ==
The given Python code from Turner's book is invalid, since it references an "add" variable which has not been defined. I don't have a copy of the book, but it would be helpful if someone could work out what it's meant to refer to and add (hah!) that in.
5-13-2010 <span style="font-size: smaller;" class="autosigned">—Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[Special:Contributions/72.248.210.37|72.248.210.37]] ([[User talk:72.248.210.37|talk]]) 14:38, 13 May 2010 (UTC)</span><!-- Template:UnsignedIP --> <!--Autosigned by SineBot-->
* Huh? This article links to an article by Turner, which you can download and read at no charge, not a book. And the Python code is from Raymond Hettinger's recipe on ActiveState. [[User:Obscuranym|Obscuranym]] ([[User talk:Obscuranym|talk]]) <span style="font-size: smaller;" class="autosigned">—Preceding [[Wikipedia:Signatures|undated]] comment added 12:12, 16 May 2010 (UTC).</span><!--Template:Undated--> <!--Autosigned by SineBot-->

== Language ==
What language was this written in originally before it got run through Google's translator? (Don't bother telling me the person who wrote it knew English, because I'm not having it!) <span style="font-size: smaller;" class="autosigned">—Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[Special:Contributions/221.232.82.218|221.232.82.218]] ([[User talk:221.232.82.218|talk]]) 01:28, 27 July 2010 (UTC)</span><!-- Template:UnsignedIP --> <!--Autosigned by SineBot-->
: If you have some constructive criticism, I'd love to hear it. [[User:Obscuranym|Obscuranym]] ([[User talk:Obscuranym|talk]]) 05:54, 28 July 2010 (UTC)

== Lazy or non-strict? ==
The article says:
: Even if the result is finite, this example depends on lazy evaluation due to the use of self-referential data structures.
I think non-strict evaluation (not necessarily lazy) is enough; other non-strict evaluation strategies, such as lenient evaluation, might suffice. See http://www.haskell.org/haskellwiki/Lazy_vs._non-strict . [[Special:Contributions/187.46.235.5|187.46.235.5]] ([[User talk:187.46.235.5|talk]]) 22:40, 8 October 2011 (UTC)
: You're right (although outside of the Haskell community "lazy" is often used synonymously with "non-strict", i.e.
without the sharing requirement). —''[[User:Ruud Koot|Ruud]]'' 11:35, 10 October 2011 (UTC)

== Major expansion: non-technical introduction, imperative examples (Python) ==
I’ve just completed a significant expansion (more than doubling) of the article, as of [http://en.wikipedia.org/w/index.php?title=Corecursion&oldid=504128625 this edit]. This consists essentially of:
* Rewriting and expanding the non-technical lede (introduction),
* Giving two detailed examples, contrasting with recursion: factorial and tree traversal, implemented imperatively in Python,
(together with various cleanup and formatting, and a brief note on history, AFAICT).

Corecursion seems a very interesting topic, and perhaps a useful fundamental viewpoint, which appears to be only recently appreciated and not yet widely known (certainly not outside the functional programming community). I am no expert on this subject – I just reworked the existing article (discussion and Haskell examples), expanding the wording and giving explicit examples, and looked at the references. The non-technical discussion is perhaps a bit glib (it very sharply delimits the recursion/corecursion duality), but hopefully accurate and readable. The key points – corecursion is useful for infinite data, and works ''forward'' (''synthetically,'' in contrast to ''analytically,'' as in regular recursion) – seem easy enough to state clearly in non-technical terms. As for the examples:
* Factorial is very simple – it’s the basic recursion example, and simpler than Fibonacci, so it seems a good base case from the recursion point of view.
* Tree traversal seems a great contrast example – depth-first traversal is a fundamental non-trivial example of recursion (and very well-known), while breadth-first traversal is a basic motivating example of corecursion, which thus should have similar prominence.
As this topic is potentially very abstract, and functional programming is often unfamiliar, concrete examples with imperative code (the Python is virtually pseudo-code, with the only confusing point being “yield”) should clarify it. Hope this helps, and please improve as you see fit!
:—Nils von Barth ([[User:Nbarth|nbarth]]) ([[User talk:Nbarth|talk]]) 17:50, 25 July 2012 (UTC)
: In my opinion the difference between recursion and corecursion is best demonstrated in languages that make an explicit distinction between them (e.g. Coq or Agda). In Haskell the distinction is vague (because—ignoring strictness annotations—all data is codata); imperative languages are perhaps even worse, because recursion is less "natural" in those languages and they don't have algebraic data types. In Coq a "recursive function" is one that operates by structural recursion over data (i.e. its domain is data), while a "corecursive function" is one that has codata as its result/range and operates by guarded recursion. Perhaps a strict functional language such as ML or Scheme would work better, as all their data is data and codata has to be obtained by explicit delays and forces. —''[[User:Ruud Koot|Ruud]]'' 19:22, 25 July 2012 (UTC)

== Related concepts? ==
I’d be interested in related concepts for context and to help people more familiar with other areas. Notably, in set theory/logic terms, recursion/corecursion sound similar to [[recursive set]]/[[recursively enumerable set]], in that the former is “defined by input” while the latter is “defined by output” – corecursion and recursive enumerability both sound basically like “what you can get as output”. Corecursion also feels ''very'' similar to [[tail recursion]], or perhaps better, a generalization of it – more precisely, tail recursion feels like doing recursion by having a ''backwards'' corecursion; the accumulator variable you often find in tail recursions is the give-away (the accumulator is the corecursive sequence).
For example, computing the factorial via tail recursion actually consists of computing the [[falling factorial]] corecursively, and then outputting the end of this sequence (which is finite if you start at a natural number). Indeed, user [http://stackoverflow.com/users/849891/will-ness Will Ness] at StackOverflow claims that corecursion is basically (exactly?) [[tail recursion modulo cons]] (e.g., [http://stackoverflow.com/a/10874351 this answer], July 2012). As intriguing as this may seem, I didn’t find any substantiation of this in a quick literature/Google search, so this may be completely mistaken, and in any case is unsourced speculation. If anyone could find discussions of corecursion that are not just in abstract functional programming terms, they would be quite appreciated!
:—Nils von Barth ([[User:Nbarth|nbarth]]) ([[User talk:Nbarth|talk]]) 18:30, 25 July 2012 (UTC)
: Some related concepts:
:* Codata (or coinductive types) are defined by [[coinduction]], that is, as the [[greatest fixpoint]] of a set of equations. These types can model infinite structures (e.g. potentially infinite lists or [[stream (type theory)|stream]]s). In strict languages you have to do explicit [[lazy evaluation]] to work with these types, e.g. explicit delaying and forcing, or using generators.
:* Regular data is defined inductively, that is, as the [[least fixpoint]] of a set of equations. These types can only model finite structures (e.g. finite lists).
: I don't immediately see the connection with tail recursion modulo cons. Perhaps it's related to guarded recursion? —''[[User:Ruud Koot|Ruud]]'' 19:32, 25 July 2012 (UTC)

== Merge proposal ==
I propose merging [[Coinduction]] into [[Corecursion]], in a way that removes almost all content from [[Coinduction]]. I have recently worked significantly on the latter page (rewriting almost all its contents), and added a section that explains what coinduction is.
My reasons for merging are as follows:
# The explanation of coinduction on that page is (in my opinion) not great and has attracted numerous complaints on its talk page over the years. Of course, I think the explanation I wrote in [[Corecursion]] is better, not least because it at least explains what coinduction ''is''.
# Coinduction is impossible to understand without first understanding corecursion, and furthermore corecursion is very difficult to reason about without using coinduction. These two concepts are very intimately linked, and I’m not sure why they need two pages (after all, structural induction and structural recursion don’t have two separate pages).
The proposal would end up removing all content on the [[Coinduction]] page apart from the paragraph about its use in logic programming, and some references from the further reading section. The rest of the information on the page is either about Pierce’s explanation, which many find confusing, or is duplicated by [[Corecursion]] already. [[User:Kestrer|Kestrer]] ([[User talk:Kestrer|talk]]) 07:27, 15 March 2026 (UTC)
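== Corecursive Fibonacci sketch ==
Regarding the "Code Errors" thread above: the undefined "add" in the article's Python recipe is presumably <code>operator.add</code>. Here is a minimal sketch of the corecursive Fibonacci stream under that assumption; the function and variable names are illustrative, not necessarily those of the original ActiveState recipe.

```python
# Sketch only: assumes the recipe's missing "add" is operator.add.
from itertools import tee, chain, islice
from operator import add

def fibonacci():
    def deferred_output():
        yield from output          # 'output' is bound below, before first use
    result, c1, c2 = tee(deferred_output(), 3)
    # Corecursive step: each new term is the sum of two consecutive
    # previously produced terms.
    paired = map(add, c1, islice(c2, 1, None))
    output = chain([0, 1], paired)  # seed values, then the computed tail
    return result

print(list(islice(fibonacci(), 8)))  # [0, 1, 1, 2, 3, 5, 8, 13]
```

The self-reference is the point: <code>output</code> is defined in terms of a deferred copy of itself, so each new term is produced "forward" from earlier output, which is what the threads above mean by corecursion working synthetically.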
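== Tail recursion as backwards corecursion: a sketch ==
To make the speculation in the "Related concepts?" thread above concrete, here is a hedged sketch (illustrative names; unsourced, per that thread's own caveat) of how the accumulator in a tail-recursive factorial traces out exactly the falling-factorial sequence that a corecursive generator produces forward:

```python
def factorial_tail(n, acc=1):
    # Tail-recursive factorial: across the calls, acc takes the values
    # 1, n, n*(n-1), ... -- the falling factorial of n, computed
    # "backwards" while recursing down to the base case.
    return acc if n == 0 else factorial_tail(n - 1, acc * n)

def falling_factorials(n):
    # Corecursive counterpart: emit the same sequence "forward",
    # each term built from the previous output.
    acc = 1
    while True:
        yield acc
        if n == 0:
            return
        acc *= n
        n -= 1

print(factorial_tail(5))            # 120
print(list(falling_factorials(5)))  # [1, 5, 20, 60, 120, 120]
```

The tail-recursive result is the last element of the corecursive sequence, matching the observation above that tail-recursive factorial amounts to computing the falling factorial corecursively and outputting the end.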