| This is the talk page for discussing improvements to the Programming language article. This is not a forum for general discussion of the subject of the article. |
| Programming language was one of the Engineering and technology good articles, but it has been removed from the list. There are suggestions below for improving the article to meet the good article criteria. Once these issues have been addressed, the article can be renominated. Editors may also seek a reassessment of the decision if they believe there was a mistake. |
| A summary of this article appears in Language. |
| The content of Dialect (computing) was merged into Programming language on 7 February 2018. The former page's history now serves to provide attribution for that content in the latter page, and it must not be deleted as long as the latter page exists. For the discussion at that location, see its talk page. |
Currently this article assumes that programming languages are a phenomenon exclusive to machines in general, and computers in particular.
This assumption seems inappropriate, given that there is at least one widely recognized counterexample: biological programming languages:
dr.ef.tymac (talk) 18:13, 29 April 2017 (UTC)
It repeats many commonplaces, many of them wrong, that are nonetheless repeated again and again in basic programming courses built on outdated information. Many "complete idiot's guide", "learn in N days", and "for dummies" books, like those in the photo of tech books, repeat them too.
Many people know some programming language and write code. That does not make them an authority on the subject. However, many of them feel they are.
This article seems written from notes taken in basic programming courses.
It has discussions like: how many angels can dance on the point of a needle? — Preceding unsigned comment added by 201.124.211.115 (talk) 05:53, 11 June 2017 (UTC)
Yep, too much information unrelated to the core subject that does not help understanding. This needs to be moved to the relevant topics.
For example, the paragraph on FLOW-MATIC adds nothing to understanding what a computer language is and should be in the article on FLOW-MATIC, not this one.
There are also too many competing ideas, such as the definition of a programming language, that just confuse things. My view is that Wikipedia should focus on commonly accepted facts and theories rather than pet issues inserted by academics to give exposure to very minority theories. It is there to help understanding, not to serve as a weapon in obscure academic debates and personal obsessions. 60.241.211.27 (talk) 12:07, 28 July 2021 (UTC)
Wikipedia currently says a "programming language" is "a formal language that specifies a set of instructions that can be used to produce various kinds of output", which is true-ish but vague. The phrase "various kinds of output" hints at the relevant characteristic but still doesn't pin it down. Dictionary.com defines a "programming language" as "a high-level language used to write computer programs, as COBOL or BASIC, or, sometimes, an assembly language." This gibberish dances around it for a moment... but it makes no difference whether it is the highest-level symbolic meta-language or the lowest-level machine code. An alternate Dictionary.com definition gets closer, "a simple language system designed to facilitate the writing of computer programs", but simplicity has nothing to do with it either, and this definition still doesn't capture the essence. The important nugget here is the specification of decision making. A "computer language" is any predefined set of symbols and syntax that allows people to communicate with a computer system. But a "programming language" is a language among the broader set of computer languages that specifically enables a person to specify decision-making rules. CPUs make logical (true/false) decisions. The specification of a logical decision-making process is where the rubber meets the road. For example, HTML is a computer language, but not a programming language. If I want my computer's clock to display upside down, but only on Tuesdays, I can't use HTML to accomplish that. Get it? (HTML is really just a data markup language used to specify the metadata and the semantic structure of a Web document.)
Shall I take a crack at rewriting the first paragraph of the "programming language" page and let you all have a look? I won't spend too much time on it unless the community wants me to, so let me know what you think. — Preceding unsigned comment added by Dlampton (talk • contribs) 00:39, 8 January 2018 (UTC)
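To make the distinction above concrete, here is a minimal, purely illustrative C sketch (not proposed article text) of the run-time decision-making that a markup language like HTML cannot express, using the "upside down on Tuesdays" example:

```c
/* Purely illustrative: a programming language lets you branch on a
   condition at run time, which HTML markup cannot express. */
#include <stdio.h>
#include <time.h>

int main(void) {
    time_t now = time(NULL);
    struct tm *t = localtime(&now);
    /* tm_wday counts days since Sunday, so 2 means Tuesday */
    if (t->tm_wday == 2) {
        printf("It is Tuesday: display the clock upside down.\n");
    } else {
        printf("Display the clock normally.\n");
    }
    return 0;
}
```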
@Squoop: you used some ref names that don't exist in the Abstractions section. Could you please fill those in? -- Fyrael (talk) 21:58, 10 October 2022 (UTC)
A recent edit by Sumanbalayar corrected "if computational semantics is defined" to "if a computational semantics is defined", but it was reverted by Girth Summit. Using "semantics" as a singular noun is standard in the field, for example:
We first define the terms, then define a semantics showing how they behave, then give a type system that rejects some terms whose behaviors we don't like.
I favor Sumanbalayar's version of this sentence. Freoh (talk) 16:20, 5 January 2023 (UTC)
It would be helpful if there were a small set of "standard" tasks and all pages on programming languages showed how to do them all (e.g., compute prime numbers, compute the squares of the numbers 1 to 10, print "Hello, world!"). Where should I suggest this? LachlanA (talk) 01:00, 21 January 2023 (UTC)
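For illustration, here is a hedged sketch of how the three suggested tasks might look in C; the snippets on each language's page would of course use that language:

```c
/* A sketch of the three suggested "standard" tasks, in C. */
#include <stdio.h>
#include <stdbool.h>

/* trial division: returns true if n is prime */
static bool is_prime(int n) {
    if (n < 2) return false;
    for (int d = 2; d * d <= n; d++)
        if (n % d == 0) return false;
    return true;
}

int main(void) {
    printf("Hello, world!\n");                /* task 1: hello world */
    for (int i = 1; i <= 10; i++)             /* task 2: squares of 1..10 */
        printf("%d squared is %d\n", i, i * i);
    for (int n = 2; n <= 30; n++)             /* task 3: primes (up to 30) */
        if (is_prime(n)) printf("%d is prime\n", n);
    return 0;
}
```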
Oracle asserting something shouldn't be taken as evidence; this needs better sources. FallingPineapple (talk) 02:05, 7 June 2023 (UTC)
I think this article needs a thorough cleanup.
I feel like the article as a whole fails to provide accurate and useful information about programming languages.
I tried to improve the introduction but I'm sure my improvements still have issues...
Could a cleanup tag (or tags) be added? I'm not an experienced Wikipedian, so I'm not sure what needs to be done. Squoop (talk) 01:19, 28 October 2023 (UTC)
The source code example https://en.wikipedia.org/wiki/Programming_language#/media/File:C_Hello_World_Program.png mentions that sayHello is a function and, in brackets, a 'method'. This is not a method: a method is associated with an object/class, and C has no classes. 2A02:8389:2200:9F90:B5B9:15B4:D8AA:5901 (talk) 09:54, 5 December 2023 (UTC)
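For reference, a hypothetical reconstruction (the exact code in the image may differ) showing why sayHello is a plain, free-standing function in C rather than a method:

```c
/* Hypothetical reconstruction, not the exact code in the image:
   in C, sayHello is a free-standing function. C has no classes,
   so calling it a "method" is a misnomer. */
#include <stdio.h>

void sayHello(void) {
    printf("Hello, world!\n");
}

int main(void) {
    sayHello();  /* an ordinary function call, not a method invocation */
    return 0;
}
```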
| This edit request by an editor with a conflict of interest has now been answered. |
Extended content |
|---|
Programming languages are often placed into four main categories: imperative, functional, logic, and object-oriented.[1]
Although markup languages are not programming languages, some have extensions that support limited programming. Additionally, there are special-purpose languages that are not easily compared to other programming languages.[5]
References
|
Sebesta, Robert W. (2012). Concepts of Programming Languages (10th ed.). Addison-Wesley. ISBN 978-0-13-139531-2.
Please add the following text at the end of the "elements" section:
Extended content |
|---|
Concurrency
See also: Concurrent computing
In computing, multiple instructions can be executed simultaneously. Many programming languages support instruction-level and subprogram-level concurrency.[1] By the twenty-first century, additional processing power on computers was increasingly coming from the use of additional processors, which requires programmers to design software that makes use of multiple processors simultaneously to achieve improved performance.[2] Interpreted languages such as Python and Ruby do not support the concurrent use of multiple processors.[3] Other programming languages do support managing data shared between different threads by controlling the order of execution of key instructions via the use of semaphores, controlling access to shared data via a monitor, or enabling message passing between threads.[4]
References
|
Buidhe paid (talk) 17:48, 23 January 2024 (UTC)
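As an aside for reviewers, a minimal sketch of the controlled-access-to-shared-data mechanism described in the proposed text, using POSIX threads (this assumes a platform providing pthreads; it is one possible illustration, not part of the requested article text):

```c
/* A mutex serializes access to a counter that two threads
   increment concurrently; without it, updates could be lost. */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg) {
    (void)arg;
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&lock);    /* only one thread may enter */
        counter++;                    /* the protected shared datum */
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("counter = %ld\n", counter);  /* always 200000 with the mutex */
    return 0;
}
```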
| This edit request by an editor with a conflict of interest has now been answered. |
Please add the following text just before the "Concurrency" section:
Extended content |
|---|
Exception handling
Main article: Exception handling
Many programming languages include exception handlers, a section of code triggered by runtime errors that can deal with them in two main ways:[1]
Some programming languages support dedicating a block of code to run regardless of whether an exception occurs before the code is reached; this is called finalization.[2] There is a tradeoff between increased ability to handle exceptions and reduced performance.[3] For example, even though array index errors are common,[4] C does not check for them for performance reasons.[3] Although programmers can write code to catch user-defined exceptions, this can clutter a program. Standard libraries in some languages, such as C, use their return values to indicate an exception.[5] Some languages and their compilers have the option of turning error handling capability on and off, either temporarily or permanently.[6]
References
|
Thanks Buidhe paid (talk) 19:27, 23 January 2024 (UTC)
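As an aside, a small sketch of the C return-value convention mentioned in the proposed text: the standard library signals failure through return values such as NULL, which the caller must check explicitly (illustrative only, not part of the requested article text):

```c
/* C has no built-in exceptions; library calls report failure
   through their return values, which the caller must inspect. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    FILE *f = fopen("no-such-file.txt", "r");
    if (f == NULL) {            /* the "exception" is a NULL return */
        perror("fopen");        /* report why the call failed */
        return EXIT_FAILURE;
    }
    /* ... use the file ... */
    fclose(f);
    return EXIT_SUCCESS;
}
```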
| This edit request by an editor with a conflict of interest has now been answered. |
Please change the content of the "Type system" section, after the hatnote, to:
Extended content |
|---|
A data type is a set of allowable values and operations that can be performed on these values.[1] Each programming language's type system defines which data types exist, the type of an expression, and how type equivalence and type compatibility function in the language.[2] According to type theory, a language is fully typed if the specification of every operation defines the types of data to which the operation is applicable.[3] In contrast, an untyped language, such as most assembly languages, allows any operation to be performed on any data, generally sequences of bits of various lengths.[3] In practice, while few languages are fully typed, most offer a degree of typing.[3] Because different types (such as integers and floats) represent values differently, unexpected results will occur if one type is used when another is expected. Type checking will flag this error, usually at compile time (runtime type checking is more costly).[4] With strong typing, type errors can always be detected unless variables are explicitly cast to a different type. Weak typing occurs when languages allow implicit casting—for example, to enable operations between variables of different types without the programmer making an explicit type conversion. The more cases in which this type coercion is allowed, the fewer type errors can be detected.[5]
Commonly supported types
See also: Primitive data type
Early programming languages often supported only built-in, numeric types such as the integer (signed and unsigned) and floating point (to support operations on real numbers that are not integers). Most programming languages support multiple sizes of floats (often called float and double) and integers depending on the size and precision required by the programmer. Storing an integer in a type that is too small to represent it leads to integer overflow. The most common way of representing negative numbers with signed types is two's complement, although one's complement is also used.[6] Other common types include boolean—which is either true or false—and character—traditionally one byte, sufficient to represent all ASCII characters.[7] Arrays are a data type whose elements, in many languages, must consist of a single type of fixed length. Other languages define arrays as references to data stored elsewhere and support elements of varying types.[8] Depending on the programming language, sequences of multiple characters, called strings, may be supported as arrays of characters or as their own primitive type.[9] Strings may be of fixed or variable length; the latter enables greater flexibility at the cost of increased storage space and more complexity.[10] Other data types that may be supported include lists,[11] associative (unordered) arrays accessed via keys,[12] records in which data is mapped to names in an ordered structure,[13] and tuples—similar to records but without names for data fields.[14] Pointers store memory addresses, typically referencing locations on the heap where other data is stored.[15] The simplest user-defined type is an ordinal type whose values can be mapped onto the set of positive integers.[16] Since the mid-1980s, most programming languages have also supported abstract data types, in which the representation of the data and operations are hidden from the user, who can only access an interface.[17] The benefits of data abstraction can include increased reliability, reduced complexity, less potential for name collisions, and allowing the underlying data structure to be changed without the client needing to alter its code.[18]
Static and dynamic typing
In static typing, all expressions have their types determined before a program executes, typically at compile time.[3] Most widely used, statically typed programming languages require the types of variables to be specified explicitly. In some languages, types are implicit; one form of this is when the compiler can infer types based on context. The downside of implicit typing is the potential for errors to go undetected.[19] Complete type inference has traditionally been associated with functional languages such as Haskell and ML.[20] With dynamic typing, the type is not attached to the variable but only to the value encoded in it. A single variable can be reused for a value of a different type. Although this provides more flexibility to the programmer, it comes at the cost of lower reliability and less ability for the programming language to check for errors.[21] Some languages allow variables of a union type to which any type of value can be assigned, in an exception to their usual static typing rules.[22]
References
|
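As an aside for reviewers, a brief C illustration of two of the typing concepts in the proposed text, implicit type coercion and storing a value in a type too small to represent it (illustrative only, not part of the requested article text):

```c
/* Implicit coercion between int and double, and wraparound when a
   value is stored in a type too small to represent it. */
#include <stdio.h>

int main(void) {
    int i = 7;
    double d = i / 2;     /* integer division first, then coercion: 3.0 */
    double e = i / 2.0;   /* i is coerced to double before dividing: 3.5 */
    printf("%f %f\n", d, e);

    unsigned char small = 300;  /* 300 does not fit in 8 bits: wraps to 44 */
    printf("%u\n", (unsigned)small);
    return 0;
}
```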
| This edit request by an editor with a conflict of interest has now been answered. |
Also, please add the following sentence to the end of the "proprietary languages" section: "Open source programming languages are particularly helpful for open science applications, enhancing the capacity for replication and code sharing."[1]
References
Thanks Buidhe paid (talk) 02:23, 25 January 2024 (UTC)
| Part of an edit requested by an editor with a conflict of interest has been implemented. |
Please:
Extended content |
|---|
One of the most important influences on programming language design has been computer architecture. Imperative languages, the most commonly used type, were designed to perform well on von Neumann architecture, the most common computer architecture.[1] In von Neumann architecture, the memory stores both data and instructions, while the CPU that performs instructions on data is separate, and data must be piped back and forth to the CPU. The central elements in these languages are variables, assignment, and iteration, which is more efficient than recursion on these machines.[2] Many programming languages have been designed from scratch, altered to meet new needs, and combined with other languages. Many have eventually fallen into disuse.[citation needed] The birth of programming languages in the 1950s was stimulated by the desire to make a universal programming language suitable for all machines and uses, avoiding the need to write code for different computers.[3] By the early 1960s, the idea of a universal language was rejected due to the differing requirements of the variety of purposes for which code was written.[4]
Tradeoffs
Desirable qualities of programming languages include readability, writability, and reliability.[5] These features can reduce the cost of training programmers in a language, the amount of time needed to write and maintain programs, and the cost of compiling the code, as well as increase runtime performance.[6]
Programming language design often involves tradeoffs.[16] For example, features to improve reliability typically come at the cost of performance.[17] Increased expressivity due to a large number of operators makes writing code easier but comes at the cost of readability.[17] Natural-language programming has been proposed as a way to eliminate the need for a specialized language for programming. However, this goal remains distant and its benefits are open to debate. Edsger W. Dijkstra took the position that the use of a formal language is essential to prevent the introduction of meaningless constructs.[18] Alan Perlis was similarly dismissive of the idea.[19]
Specification
Main article: Programming language specification
The specification of a programming language is an artifact that the language users and the implementors can use to agree upon whether a piece of source code is a valid program in that language, and if so, what its behavior shall be. A programming language specification can take several forms, including the following:
Implementation
Main article: Programming language implementation
An implementation of a programming language is the conversion of a program into machine code that can be executed by the hardware. The machine code can then be executed with the help of the operating system.[23] The most common form of implementation for production code is a compiler, which translates the source code, via an intermediate-level language, into machine code, known as an executable. Once the program is compiled, it will run more quickly than with other implementation methods.[24] Some compilers can provide further optimization to reduce memory or computation usage when the executable runs, at the cost of increased compilation time.[25] Another implementation method is to run the program with an interpreter, which translates each line of software into machine code just before it executes. Although it can make debugging easier, the downside of interpretation is that it runs 10 to 100 times slower than a compiled executable.[26] Hybrid interpretation methods provide some of the benefits of compilation and some of the benefits of interpretation via partial compilation. One form this takes is just-in-time compilation, in which the software is compiled ahead of time into an intermediate language and then into machine code immediately before execution.[27]
References
|
Please add the following source to the "further reading" section:
Thank you Buidhe paid (talk) 04:37, 27 January 2024 (UTC)
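As an aside, a small C sketch of the iteration-versus-recursion point in the proposed text: both functions compute the same sum, but the loop needs no call-stack growth, which is one reason imperative languages favor iteration on von Neumann hardware (illustrative only, not part of the requested article text):

```c
/* Iteration vs. recursion computing the sum 1 + 2 + ... + n. */
#include <stdio.h>

long sum_iterative(int n) {
    long total = 0;
    for (int i = 1; i <= n; i++)  /* one stack frame, constant memory */
        total += i;
    return total;
}

long sum_recursive(int n) {
    if (n == 0) return 0;
    return n + sum_recursive(n - 1);  /* one stack frame per call */
}

int main(void) {
    printf("%ld %ld\n", sum_iterative(100), sum_recursive(100));  /* 5050 5050 */
    return 0;
}
```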
| − | [[Edsger W. Dijkstra]] took the position that the use of a formal language is essential to prevent the introduction of meaningless | + | [[Edsger W. Dijkstra]] took the position that the use of a formal language is essential to prevent the introduction of meaningless constructs. [[Alan Perlis]] was similarly dismissive of the idea. |
The issue with UNDUE hasn't been resolved, because due weight is based on coverage in reliable sources about "programming languages" in general. Many languages do have a standard library, but that fact by itself doesn't mean that it's DUE to cover them extensively in this high-level overview. "Natural language programming" never got off the ground, and the content about it is also UNDUE. (relevant discussion) Buidhe paid (talk) 02:20, 13 April 2024 (UTC)
| This edit request by an editor with a conflict of interest has now been answered. |
Please change the content of the article lead (beginning below the "use dmy dates" template) to:
Extended content |
|---|
A programming language is a system of notation for writing computer programs.[1] Programming languages are described in terms of their syntax (form) and semantics (meaning), usually defined by a formal language. Most languages have a type system consisting of different data types (such as integers and strings) and catch type errors, in which one type is given where another is expected. Many support user-defined types, including abstract data types, often used for object-oriented programming. Implementation in the form of a compiler or interpreter allows programs to be translated into machine code and executed. Computer architecture has strongly influenced the design of programming languages, with the most common type (imperative languages—which implement operations in a specified order) developed to perform well on the popular von Neumann architecture. While early programming languages were closely tied to the hardware, over time they have developed more abstraction to hide implementation details for greater simplicity. Thousands of programming languages—often classified as imperative, functional, logic, or object-oriented—have been developed for a wide variety of uses. Many aspects of programming language design involve tradeoffs—for example, exception handling reduces errors at a performance cost—such that efforts to develop a universal programming language have failed. Programming language theory is the subfield of computer science that studies the design, implementation, analysis, characterization, and classification of programming languages.
References
|
Thanks Buidhe paid (talk) 08:57, 27 January 2024 (UTC)
| This edit request by an editor with a conflict of interest has now been answered. |
Please replace the content of the "History" section, after the hatnote, with:
Extended content |
|---|
Early developments
The first programmable computers were invented at the end of the 1940s, and with them, the first programming languages.[1] The earliest computers were programmed in first-generation programming languages (1GLs), machine language (simple instructions that could be directly executed by the processor). This code was very difficult to debug and was not portable between different computer systems.[2] In order to improve the ease of programming, assembly languages (or second-generation programming languages, 2GLs) were invented, diverging from the machine language to make programs easier for humans to understand, although they did not increase portability.[3] Initially, hardware resources were scarce and expensive, while human resources were cheaper. Therefore, cumbersome languages that were time-consuming to use but closer to the hardware for higher efficiency were favored.[4] The introduction of high-level programming languages (third-generation programming languages, 3GLs) revolutionized programming. These languages abstracted away the details of the hardware and were instead designed to express algorithms that could be understood more easily by humans. For example, arithmetic expressions could now be written in symbolic notation and later translated into machine code that the hardware could execute.[3] In 1957, Fortran (FORmula TRANslation) was invented. Often considered the first compiled high-level programming language,[3][5] Fortran has remained in use into the twenty-first century.[6]
1960s and 1970s
Around 1960, the first mainframes—general-purpose computers—were developed, although they could only be operated by professionals and the cost was extreme. The data and instructions were input by punch cards, meaning that no input could be added while the program was running. The languages developed at this time therefore were designed for minimal interaction.[8] After the invention of the microprocessor, computers in the 1970s became dramatically cheaper.[9] New computers also allowed more user interaction, which was supported by newer programming languages.[10] Lisp, implemented in 1958, was the first functional programming language. Unlike Fortran, it supports recursion and conditional expressions,[11] and it also introduced dynamic memory management on a heap and automatic garbage collection.[12] For the next decades, Lisp dominated artificial intelligence applications.[13] In 1978, another functional language, ML, introduced inferred types and polymorphic parameters.[10][14] After ALGOL (ALGOrithmic Language) was released in 1958 and 1960,[15] it became the standard in computing literature for describing algorithms. Although its commercial success was limited, most popular imperative languages—including C, Pascal, Ada, C++, Java, and C#—are directly or indirectly descended from ALGOL 60.[16][6] Its innovations adopted by later programming languages included greater portability and the first use of a context-free BNF grammar.[17] Simula, the first language to support object-oriented programming (including subtypes, dynamic dispatch, and inheritance), also descends from ALGOL and achieved commercial success.[18] C, another ALGOL descendant, has sustained popularity into the twenty-first century. C allows access to lower-level machine operations more than other contemporary languages. Its power and efficiency, generated in part by flexible pointer operations, come at the cost of making it more difficult to write correct code.[10] Prolog, designed in 1972, was the first logic programming language, communicating with a computer using formal logic notation.[19][20] With logic programming, the programmer specifies a desired result and allows the interpreter to decide how to achieve it.[21][20]
1980s to present
During the 1980s, the invention of the personal computer transformed the roles for which programming languages were used.[22] New languages introduced in the 1980s included C++, a superset of C that can compile C programs but also supports classes and inheritance.[23] Ada and other new languages introduced support for concurrency.[24] The Japanese government invested heavily in the so-called fifth-generation languages, which added support for concurrency to logic programming constructs, but these languages were outperformed by other concurrency-supporting languages.[25][26] Due to the rapid growth of the Internet and the World Wide Web in the 1990s, new programming languages were introduced to support Web pages and networking.[27] Java, based on C++ and designed for increased portability across systems and for security, enjoyed large-scale success because these features are essential for many Internet applications.[28][29] Another development was that of dynamically typed scripting languages—Python, JavaScript, PHP, and Ruby—designed to quickly produce small programs that coordinate existing applications. Due to their integration with HTML, they have also been used for building web pages hosted on servers.[30][31] During the 2000s, there was a slowdown in the development of new programming languages that achieved widespread popularity.[32] One innovation was service-oriented programming, designed to exploit distributed systems whose components are connected by a network. Services are similar to objects in object-oriented programming but run in a separate process.[33] C# and F# cross-pollinated ideas between imperative and functional programming.[34] After 2010, several new languages—Rust, Go, and Swift—competed for the performance-critical software for which C had historically been used.[35]
References
|
Also, add the following source to the "further reading" section:
Reasons: add references, improve summary style, remove unsourced text, fix some MOS:CURRENT issues. Buidhe paid (talk) 20:08, 28 January 2024 (UTC)
This article was the subject of a Wiki Education Foundation-supported course assignment, between 30 January 2024 and 10 May 2024. Further details are available on the course page. Student editor(s): SirRiles (article contribs). Peer reviewers: ApolloMartin.
— Assignment last updated by KAN2035117 (talk) 22:51, 3 April 2024 (UTC)
Summary of changes as a result of the Wiki99 project (before, after, diff):
Further possibilities for improvement:
Buidhe paid (talk) 07:13, 5 August 2024 (UTC)
This article was the subject of a Wiki Education Foundation-supported course assignment, between 3 September 2024 and 13 December 2024. Further details are available on the course page. Student editor(s): Pbblombo1! (article contribs). Peer reviewers: Sagethehero.
— Assignment last updated by KAN2035117 (talk) 02:34, 30 October 2024 (UTC)
The "definitions" section isn't really... a definitions section at this point. I suggest this current section should be separated out into two sections: "Formal definitions" and "informal definitions." I'm happy to take it on, but it seemed like a big change so I figured I'd put it here for discussion for input before presuming to rewrite a bunch. I know this is long - but that's so that this could serve as a sort of rough rough draft for feedback in regard to how I think the section should be restructured if the objections aren't too strong - just trying to lay it all out.
I suggest this for a few reasons(also note that I am not working hard to hide any "bias" here, but would endeavor to steelman any viewpoints I wrote about in the actual article):
1) Formal Definitions are important but obscure/nuanced The more stringent, formal definitions are, frankly, not super useful outside of theoretical conversations (and I say this as someone who does in fact use those definitions). In computer science, the emphasis of a "programming language" was actually on the fact that it resembled *written language* (vs punch cards). For a long time the goal (and arguably marketing) was around developing programming language technology to the point that it would essentially reach natural speech. This distinction about input style is still extremely important, especially for contemporary discussions about LLM's, for example. But now that programming languages are abundant and common, most people do not find themselves making this distinction, leading to a sort of "drift" in the meaning of the term as its original meaning becomes *colloquially* irrelevant.
2) Trying to cite/support this section is a nightmare right now There are many informal definitions scattered throughout this and other sections, but they are not explicitly labeled, nor their merits discussed. Aside from there surely being some high quality definitions out there to cite, this has lead, in my opinion, to citations that attempt to prove the *validity* of a definition rather than cite its usage or source. For example, one citation simply stated "x language is not a programming language" as a way to show that some authors use the term "computer languages for things that aren't considered programming languages" (or specifically give an example of a term it might be used for) - and while this does demonstrate that there are "languages that some authors don't consider programming languages," the author was not asserting that it was a "computer language" or referencing the term (at least not that I could find). To demonstrate why this makes no sense, I could just as easily cite that as evidence that people use the term L33TCOD3 to describe things that aren't considered programming languages. And that's not to say the validity of the definitions shouldn't be discussed - merely that it should be done so explicitly.
3) Informal definitions fall into tidy but distinct categories There are several "batches" of informal definitions, each with their own utility/limitations, and these are worth discussing in contrast to more formal/historical definitions. These are the groups as I see them, and some of the utility they lack vs the more formal definitions - I believe people tend to informally mix/match these (and obviously in a full writeup I would cite sources):
Programming Languages Are Compiled - many people will use compilation as a standard for whether or not something is a "real" programming language, usually arguing that an interpreter makes it a "scripting" language. This definition does not do a good job of describing virtual-machine-based language implementations.
Programming Languages Must Be Turing Complete - the reason people think this is obvious, but an important misconception is that this test is meant more generally for instruction/mathematical operation sets and is merely applicable to programming languages as a result. Like much of Turing's work, it was just so solid that we can continue to use it for things he never even got to see, and programming languages inherently overlap with instruction sets by design - but at the end of the day, Turing completeness just means being able to accomplish the same things as a certain class of machine he made up (not to diminish the concept; it's a really important machine that does a lot).
Programming Languages Must Be Imperative/Have Logic/Control Structures - this perception often arises due to the popularity and power of imperative programming languages. However, as the name suggests (and as this article even mentions several times), that's formally a *subset* of programming languages. Declarative programming often "feels" less like programming because it is not as concerned with how the task is going to be accomplished, but the original distinction of instructing a machine with language was not concerned with that sort of distinction.
I'm open to any feedback/discussion on this. I'm relatively new to editing here, but this is my field, and I am happy to elaborate on or support anything that seems subjective/biased (or just own it if it's a bias). I really think this would both represent these viewpoints more clearly and contextualize why people might have differing ones "in the wild". Theaceofthespade (talk) 18:28, 2 March 2025 (UTC)
WRT "A programming language is a system of notation for writing computer programs."
Natural language is for writing books. ...And a whole lot of other stuff.
Also, it's not just about writing. I could generate code, and that's not what I'd call writing. So maybe 'authoring' is a better word.
Thing is, it's not wrong, but it's not accurate, and it's grandiose. I can use a programming language to write a fragment of code that's never compiled into a program, or compiled into a library. A language defines the rules for writing source code. Maybe less sexy, but that's all it is. If you love the word 'notation', then: a programming language is a notational system for encoding the control of a computer. Stevebroshar (talk) 02:12, 9 July 2025 (UTC)
WRT "An implementation of a programming language is required in order to execute programs"
A compiler/interpreter is not an 'implementation' of a language. A compiler/interpreter understands and conforms to a language. Stevebroshar (talk) 02:18, 9 July 2025 (UTC)
I see the lead says a programming language is an "engineered language", changed from "artificial language".
I'm not sure about this, since as far as I understand, an engineered language refers to conlangs like Lojban, which are designed (at least theoretically) as human languages.
I'm not convinced "artificial language" is the right term either, though. Formal language might be closer? That's what's used over on computer language. Or maybe there is some source that could provide an answer? Squoop (talk) 22:06, 28 January 2026 (UTC)