Untitled

In a computer, how do you convert a high-level language to machine language?

With a compiler. --Mike Van Emmerik 21:20, 26 October 2005 (UTC)
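To make the answer concrete, here is a minimal sketch of what a compiler does at its core: translating a high-level construct into the processor's machine code bytes. The x86-64 encodings used (`B8` = `mov eax, imm32`, `05` = `add eax, imm32`, `C3` = `ret`) are real, but `compile_sum` itself is a toy, not how any production compiler is structured.

```python
import struct

def compile_sum(constants):
    """Toy 'compiler': translate the high-level expression
    `return c0 + c1 + ...` into x86-64 machine code bytes."""
    code = b"\xb8" + struct.pack("<i", constants[0])   # mov eax, c0
    for c in constants[1:]:
        code += b"\x05" + struct.pack("<i", c)         # add eax, imm32
    return code + b"\xc3"                              # ret (near return)

print(compile_sum([2, 3]).hex())  # b8020000000503000000c3
```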

Untitled 2

It's not clear to me: is it the consensus that "machine language" is the same thing as "machine code"? Or is "machine language" a bit more like a grammar, and machine code only like "sentences" (programs or modules) expressed in that language? Or perhaps the language is a bit like an enum: you could talk about the Z80 language or the MIPS language, so while there is one Z80 language, there are many Z80 machine codes (compiled or assembled Z80 programs)? I think it would be good to spell this out in the article, which seems to use the two terms more or less interchangeably. --Mike Van Emmerik 21:20, 26 October 2005 (UTC)

Also, is there a consensus that instruction set is the same thing as "machine language"? The terminology makes it sound analogous to several natural languages being written out in some character set. But when someone talks about 2 different "machine languages", that always means he's talking about 2 different "instruction set"s, in my experience. I often hear the phrase "written in machine language" (usually meaning that some person typed in assembly language). When I hear "machine code", the speaker is usually pointing out a block of hexadecimal numbers generated by a compiler or an assembler. Sometimes I hear "some machine code" or "the machine code for this program", so I think you are right. It's analogous to "some English text" or "the English text for this document". But "there are many Z80 machine codes" doesn't sound quite right to my ears. "There is a lot of Z80 machine code" sounds better -- I wish I could put a finger on exactly why. -- User:DavidCary --70.189.75.148 06:13, 5 February 2006 (UTC)

Assembly language vs. symbolic machine language

I am in doubt whether these are the same. I believe that assembly language is the language actually used for coding to the assembler. On the other hand, you do not code in symbolic machine language but use it for examining code, i.e., instead of reviewing the assembler output as pure hexadecimal, you can (for learning purposes) write it in symbolic machine code, where at least all the opcodes are replaced by mnemonics. Symbolic machine code is mentioned in neither Machine code nor Assembly language. Velle 17:46, 23 March 2006 (UTC)
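For what it's worth, the distinction can be illustrated with a toy renderer. The Z80 opcodes in the table (0x3E = LD A,n, 0xC9 = RET, 0x00 = NOP, 0x76 = HALT) are real, but the tool itself is hypothetical: it turns raw bytes into a symbolic listing without being an assembler you could code in.

```python
# Opcode table for a tiny subset of the Z80 instruction set.
# A tuple marks an opcode that takes an immediate operand of n bytes.
MNEMONICS = {0x00: "NOP", 0x3E: ("LD A,", 1), 0x76: "HALT", 0xC9: "RET"}

def to_symbolic(code):
    """Render raw machine code as symbolic machine code: the same
    instructions, but with opcodes replaced by mnemonics."""
    lines, i = [], 0
    while i < len(code):
        entry = MNEMONICS[code[i]]
        if isinstance(entry, tuple):
            mnemonic, nbytes = entry
            operand = int.from_bytes(code[i + 1:i + 1 + nbytes], "little")
            lines.append(f"{mnemonic}{operand:#04x}")
            i += 1 + nbytes
        else:
            lines.append(entry)
            i += 1
    return lines

print(to_symbolic(bytes([0x3E, 0x2A, 0xC9])))  # ['LD A,0x2a', 'RET']
```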

Revision of 8th November

This revert by Karol appears to have been to a much earlier version, and happened to overlap with Tobias' reversion. --Mike Van Emmerik 22:27, 7 November 2005 (UTC)

Amended

I added some text at the beginning. --VKokielov 04:06, 31 May 2007 (UTC)

...and then elsewhere, and then moved it around and trimmed it and... eventually ended up with the "magazine clipping" paragraph back in basically the same place, in exactly the same words, but without the second half. (In fairness, the second half was phrased in more subjective value-laden terms than the first half, but even the first half feels like opinion or a pet idea.)
"It is sometimes perceived that machine languages are more fundamental than other computer programming languages."
Citation needed.
"They are not; the power of a programming language has been shown to depend on the power of the underlying machine; see Turing machine."
Okay, but you're using the word "fundamental" in two different senses, then. Machine languages may not be more Turing-complete (any more than someone can be "more pregnant"), but they are more fundamental, more basic, precisely because they form the basis and fundament of modern computer programming. These days, you practically can't have a computer without that computer using some kind of machine language. (Obviously any machine has a "machine language" in the most general sense, but I mean "machine language" as described in this article: opcodes and registers and memory addressing modes, things that look familiar to anyone who knows any machine language.)
"The structure of machine languages is a consequence of the necessity of simple elements in the electronic design of computers."
Now this is an interesting thesis. Can you support it? In particular, what is the structure of machine languages? I've given some common themes above (registers, opcodes, addressing modes), and I might add prefix-free coding; but none of those seem to be theoretically essential to computer architecture. And then, some symbolic machine languages (TI's C6000 DSP comes to mind) end up looking much more daunting and quirky than their actual architectures; so are you talking about languages or architectures, or can we consider the two concepts synonymous?
And once the structure of machine languages has been elucidated, it would be nice to list exactly what "simple elements" of architecture design you're talking about; and then how those elements necessitate that particular structure.
I generally agree with your theses (although I feel you're being tricky with sentence 2), but I don't think you can support them yourself, and I don't know of any definitive works on the subject, so I wouldn't even be able to say, "Seymour Cray says that the structure of machine languages is..." "The Structure of Machine Languages" sounds like a really interesting historical survey project, though. Any PhD candidates looking for a thesis topic? ;) --Quuxplusone 05:19, 31 May 2007 (UTC)
You win. ;) --VKokielov 10:45, 31 May 2007 (UTC)

Differences

Are programs that need a kernel to run written in machine code, or is it an OS-specific format? --Doomguy0505 10:29, 10 November 2007 (UTC)

What it is used for

The article talks about almost everything except the use of machine language or machine code, and who programs in it. Is the compiler, or part of it, programmed in machine code? Or are some parts of the OS programmed in machine code? Etc. Salem F (talk) 23:29, 12 October 2009 (UTC)

Sorry Salem F I do not quite understand what you are saying here. Were you asking a question? Can you please try saying it with different words? --220.101.28.25 (talk) 18:45, 26 October 2009 (UTC)
Yes, I do not understand the question, either. Not many people program with machine code unless they have masochistic tendencies. So, I think Salem must have meant to ask something else. stmrlbs|talk 00:59, 27 October 2009 (UTC)
  • I think my question was clear. I'll say it another way:

Can we have more examples of programs that were written in machine code? I know some programs were written in machine code in the old days, but nowadays assembly language has taken its place (see the uses of assembly language). --Salem F (talk) 20:44, 29 October 2009 (UTC)

If the question is "who writes directly in machine code, rather than in assembly language or a higher-level language?", the answer is "almost nobody" - even for low-level machine-dependent parts of OS code, that's almost always written in assembly language, as most programmers don't have, as per the earlier comment, masochistic tendencies so strong that they aren't even willing to use an assembler.
Some software writes machine code, such as assemblers, compilers that directly generate machine code rather than generating assembly code and handing it to an assembler, just-in-time compilers, and so on. Guy Harris (talk) 21:42, 10 July 2024 (UTC)
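As a sketch of the first case: an assembler is, at heart, a table-driven translator from mnemonics to machine code bytes. The following toy handles a two-instruction x86-64 subset; the encodings (`B8` = `mov eax, imm32`, `C3` = `ret`) are real, but the assembler itself is purely illustrative.

```python
import struct

def assemble(lines):
    """Toy assembler for a two-instruction x86-64 subset, showing how an
    assembler translates mnemonics into machine code bytes."""
    out = bytearray()
    for line in lines:
        parts = line.replace(",", " ").split()
        if parts[0] == "mov" and parts[1] == "eax":
            out += b"\xb8" + struct.pack("<i", int(parts[2]))  # mov eax, imm32
        elif parts[0] == "ret":
            out += b"\xc3"                                     # near return
        else:
            raise ValueError(f"unsupported instruction: {line}")
    return bytes(out)

print(assemble(["mov eax, 7", "ret"]).hex())  # b807000000c3
```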

Differences between "byte" code and "machine" code

There isn't any. The only difference is that machines that understand "byte" code are often implemented in software, while machines that understand "machine" code are often implemented in hardware, but there is no reason why a "byte" code machine cannot be implemented in hardware or a "machine" code machine in software. In fact, there are plenty of examples of both.

I suggest the two articles (this one and bytecode) be merged and this point be clarified. 13:23, 17 June 2010 (UTC) —Preceding unsigned comment added by FrederikHertzum (talk · contribs)

Strongly oppose merging in the proposed way, at least because bytecode has 18 interlanguage links to articles (meaning that in 18 other languages these topics are separate). Whether bytecode should be considered a special case of machine code is, of course, disputable. And there is a confusion between bytecode as a concept and Java bytecode, too. Incnis Mrsi (talk) 14:36, 20 June 2010 (UTC)
I don't believe I have made any suggestions as to how this should be merged (although I do see your point). Java bytecode is simply one machine language, which is used in the Java machine and as such is a type of "bytecode" or machine code. That there is no technical difference between the terms should at least be clarified in both articles, if they are not merged. 80.167.145.223 (talk) 03:24, 21 June 2010 (UTC)
I don't see how Java bytecode, which is machine-independent, can be mistaken for machine code - which is obviously machine- (i.e., processor-) dependent. Java bytecode is interpreted by a Java Virtual Machine. Machine code is interpreted by a processor. —Preceding unsigned comment added by 121.214.29.70 (talk) 15:45, 3 July 2010 (UTC)
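To illustrate the point under discussion: bytecode is, structurally, the machine code of a machine that happens to be implemented in software. A minimal stack-machine sketch (the opcode set here is invented purely for illustration):

```python
# Opcodes of an invented stack-machine bytecode (illustrative only).
PUSH, ADD, MUL, HALT = range(4)

def run(bytecode):
    """Software implementation of a machine whose 'machine code' is the
    bytecode above: a fetch-decode-execute loop, like a CPU in software."""
    stack, pc = [], 0
    while True:
        op = bytecode[pc]
        pc += 1
        if op == PUSH:
            stack.append(bytecode[pc])
            pc += 1
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == HALT:
            return stack.pop()

# Compute (2 + 3) * 4 on the virtual machine.
print(run([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, HALT]))  # 20
```

The same dispatch loop could in principle be wired up in hardware, which is why the hardware/software line does not cleanly separate the two terms.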

The edit by user:Beland

[1] What was wrong with two distinct sections about two opposite data transformations? Incnis Mrsi (talk) 17:27, 24 January 2013 (UTC)

The edit of 174.94.3.167

Although definitely a good-faith edit, I opted to remove it because of

which is, at best, ambiguous. I would say that it is rubbish, because any executable program represents a “use of machine code”. Incnis Mrsi (talk) 07:20, 25 May 2013 (UTC)

Yes, there is much confusion here. The last sentence of the intro, by using the word "typically", leaves the impression that a hardware processor may not need machine language to operate. And the first paragraph after that should say "electronic" rather than "physical" design. 74.76.137.34 (talk) 16:34, 13 September 2015 (UTC)

I'm not sure I see the confusion here - the last sentence says that the interpreter is typically machine code, which is certainly true. But there is nothing at all preventing someone from writing an interpreter in an interpreted language (in fact it's happened often), but performance will usually be poor. Rwessel (talk) 04:11, 14 September 2015 (UTC)
And as regards physical vs. electronic, electronics are the most common way to implement a CPU, but are hardly a requirement. Pneumatic and hydraulic "logic circuits" certainly exist (and are used in mechanical systems to control the operation of devices), and one could, in principle, build a computer out of such things. Babbage's Analytical Engine, for example, had it been built, would have been entirely mechanical. Rwessel (talk) 04:17, 14 September 2015 (UTC)

Relevance of Berkeley Law professor's opinion on human readability

The question is in the title.
Do you think that computing people must agree with the law professor's opinion? FelixHx (talk) 19:03, 17 May 2024 (UTC)

Changed link from Computer code article to computer program article

Regarding this edit, I misspoke in my comment. My comment should say, "Computer code article is now redirected to Source code article, which doesn't apply here." Timhowardriley (talk) 20:53, 30 June 2024 (UTC)

The role of auxiliary files in Machine code#Readability by humans

The IBM High Level Assembler (HLASM) has an ADATA option directing it to produce an associated data file output, containing data describing the contents of both the source and object files. The available debuggers for, e.g., z/OS, have the ability to display the source line corresponding to an instruction of interest. However, the ADATA file itself is not human friendly. Should Machine code § Readability by humans mention it? -- Shmuel (Seymour J.) Metz Username:Chatul (talk) 12:46, 8 July 2024 (UTC)

That sounds like a separate "debug symbols" section. Windows also uses them[1] in separate .pdb files.[2] Most UN*Xes put debug symbols into the symbol table of an object or executable file, in a format such as stabs or DWARF, but Apple's OSes also have the notion of putting debugging symbols (in DWARF format) into a separate .dSYM file.
I don't think any UN*X assemblers can generate debug symbols, except with explicit pseudo-ops used by compilers that generate assembler code and rely on the assembler to produce object files, as assembly-language programming is rare on UN*Xes. I don't know whether Windows assemblers do; in 16-bit Windows, (x86) assembly-language programming may have been common, but I suspect assembly-language programming on Windows became less common over time. A lot more is probably done on OS/360's successors, even now, so having the assembler generate debug symbols may be more useful.
It looks as if ADATA files (which can be generated by language processors other than HLASM) can also contain a copy of the source code, at least for assembly language.[3] UN*X and Windows debuggers assume you have the source code handy, and just have debug symbols to associate instructions, or ranges of same, to a particular line in a particular source file.
So mentioning debug symbols in this context might be informative, but it shouldn't assume that they're in a separate file, and the details can be left to the debug symbols page. Guy Harris (talk) 19:39, 8 July 2024 (UTC)
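As an aside, the common core of all the formats named above (DWARF line tables, stabs, ADATA records) is a mapping from machine-code addresses back to source lines. A toy illustration of what a debugger does with that mapping (the table contents and file name here are invented):

```python
# A toy line table: (start_address, end_address, source_file, line_number).
# Real debug-symbol formats encode the same address-to-line mapping,
# just in a compact binary representation.
LINE_TABLE = [
    (0x00, 0x04, "square.c", 1),
    (0x04, 0x0A, "square.c", 2),
]

def source_line_for(addr):
    """What a debugger does with debug symbols: map a machine-code
    address back to the source file and line that produced it."""
    for start, end, filename, line in LINE_TABLE:
        if start <= addr < end:
            return filename, line
    return None

print(source_line_for(0x06))  # ('square.c', 2)
```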
@Guy Harris: I've updated Machine code and Debug symbol to include information on ADATA. How much additional detail should I include, e.g., link to format? Is anyone willing to add information on other formats? -- Shmuel (Seymour J.) Metz Username:Chatul (talk) 12:55, 10 July 2024 (UTC)
@Chatul: For machine code, explicitly mentioning that ADATA files can include source code, with a link to the format given as a reference, would probably suffice, as the topic of that article is machine code, not debug symbols.
For debug symbol, links to the ADATA format (and whatever format the TEST option used) would be useful. For other formats, I'm a bit amazed that the word "dwarf", regardless of capitalization, appears nowhere in the article; that needs to be fixed. (Same with stabs.) Information on whatever format Microsoft uses, or a link to a page about that, should probably also be added. Guy Harris (talk) 20:54, 10 July 2024 (UTC)
Tools and methods to make machine code readable would be interesting. However, the current first sentence seems like a debate about whether or not to patent machine code. It isn't interesting. It is also a run-on sentence. Timhowardriley (talk) 22:12, 8 July 2024 (UTC)
See also § Relevance of Berkeley Law professor's opinion on human readability. Guy Harris (talk) 21:37, 10 July 2024 (UTC)

References

  1. ^ "Symbols for Windows debugging". Microsoft Learn.
  2. ^ "Querying the .Pdb File". Microsoft Learn.
  3. ^ "Associated Data Architecture". High Level Assembler and Toolkit Feature.

Unexplained removal of text

@Timhowardriley: Edit special:permalink/1245774947 removed the paragraph "Early CPUs had specific machine code that might break backward compatibility with each new CPU released. The notion of an instruction set architecture (ISA) defines and specifies the behavior and encoding in memory of the instruction set of the system, without specifying its exact implementation. This acts as an abstraction layer, enabling compatibility within the same family of CPUs, so that machine code written or generated according to the ISA for the family will run on all CPUs in the family, including future CPUs." I believe that the first sentence is relevant and should be restored, possibly with different wording. -- Shmuel (Seymour J.) Metz Username:Chatul (talk) 15:29, 15 September 2024 (UTC)

This edit is the one you're thinking of. I think that information should be mentioned, but I'm not sure it needs to be mentioned in the lead. Guy Harris (talk) 18:54, 15 September 2024 (UTC)
The topic sentence in the paragraph talks about non-backward compatibility. However, X86#History says, "Many additions and extensions have been added to the original x86 instruction set over the years, almost consistently with full backward compatibility." Since the x86 series started in 1978, the "early" backward compatibility problem ended 46 years ago. Also, the backward compatibility problem doesn't help describe what machine code is. The second sentence in the paragraph changes the topic to ISA. Continuity is missing. The first time I read the article, I stopped reading here. Timhowardriley (talk) 19:38, 15 September 2024 (UTC)

Is bytecode interpreted to machine code?

The last sentence in the lead says that bytecode is "interpreted ... to the host computer's machine code". I think the verb phrase "interpreted to" doesn't apply here. My understanding is the application has a bytecode interpreter that executes the instructions. Other articles call the bytecode interpreter a "virtual machine". I think the article should spell this out. However, this is a lot of detail for the reader to digest. If a change is made, then the entire paragraph should be moved from the lead to a new section. Timhowardriley (talk) 20:12, 15 September 2024 (UTC)

The last sentence in the lead says "Bytecode is then either interpreted or compiled to the host computer's machine code." This can either be interpreted as "Bytecode is then either {interpreted} or {compiled to the host computer's machine code}." or as "Bytecode is then either {interpreted or compiled} to the host computer's machine code." The former is what is intended.
"Bytecode either is interpreted or is compiled to the host computer's machine code." might be less ambiguous. Guy Harris (talk) 20:22, 15 September 2024 (UTC)Reply
Okay. It seems like you agree with my understanding that interpreting bytecode requires another program that may or may not produce machine code. For example, the Java virtual machine is a Java bytecode interpreter and has implementations that don't produce machine code. To produce machine code, the virtual machine must execute a just-in-time compiler. But what I just said applies to the Java language only. I'm sure other languages have different methods to compile or not compile their bytecodes to machine code. The nuanced fact that interpreting bytecode may not produce machine code makes me think that bytecode shouldn't be mentioned in the lead. Timhowardriley (talk) 00:22, 16 September 2024 (UTC)
"It seems like you agree with my understanding that interpreting bytecode requires another program that may or may not produce machine code." I agree with your understanding because that was my intent when I wrote that, even before you expressed that understanding. :-)
(Although machines can be built whose machine code is some language's bytecode.)
Some languages that are translated to bytecode may have implementations that never translate that bytecode to machine code, and some may even have no implementations that do translate bytecode to machine code. Java's not at all special in that regard.
And, yes, perhaps the lead - and perhaps the article as a whole - should leave discussion of interpreted languages to interpreter (computing) and discussions of bytecode to bytecode. JIT compilation is just another form of compilation, so it is arguably covered by "A high-level program may be translated into machine code by a compiler." Guy Harris (talk) 08:59, 16 September 2024 (UTC)
Just insert a comma in "Bytecode is then either interpreted or compiled to the host computer's machine code." and there will be no ambiguity. -- Shmuel (Seymour J.) Metz Username:Chatul (talk) 07:41, 16 September 2024 (UTC)
However, there are two issues. 1) Remove the ambiguity. 2) Introduce the reader to machine code without confusion. When introducing, extraneous information is confusing. The fact that bytecode can be interpreted is extraneous to machine code. Timhowardriley (talk) 08:54, 16 September 2024 (UTC)

machine code, machine language and instruction

I feel that the article still lacks a clear definition of machine language, machine code, and instruction, and I hope someone can improve it. In order to explain, I have added an image.

This image was inspired by Digital Design and Computer Architecture

Machine code#/media/File:Machine language and assembly language.jpg ShiinaKaze (talk) 15:02, 24 September 2024 (UTC)Reply