Author Topic: Someone is claiming EZflash ODE is 'emulation' and Everdrive is 'hardware'.  (Read 1253 times)


Offline Galron

  • Hero Member
  • *****
  • Posts: 802
  • Karma: +15/-0
    • View Profile
Besides trying to avoid confusing people and oversimplifying terminology... Goomba is 'emulation'; it's an 'emulator' that runs on GBA hardware, and NES emulators for GBA are 'emulation'.

Totally agree.

The idea of flashcarts is for hardware to run a cartridge as if it's the actual original hardware, without an 'emulator'...

I understand what you mean, but no. The cartridge is emulated, and everything else is obviously original.

I think there is already something particular in the case of FPGAs, in that they serve as both the host system and the emulated system at the same time. We have all been more accustomed to distinguishing one entity from the other as separate things, with software emulators running on x86 hosts, and less accustomed to layered configurations.

Goomba is a software emulator, accessed from a hardware-emulated cartridge, executed on an ARM host.

Virtual Console is a software emulator that can run in a software-emulated Wii, executed on an x86 host.

Parallel has a more sophisticated processing technique: a cycle-accurate software rendering emulator, avoiding API calls, translated to shader language, executed on the GPU, hardware-accelerated.

So, when the seller opposes 'emulation' and 'hardware', I see that as an inappropriate way to say that one product is closer to real hardware (likely, but the difference should be very minor) and that the other is closer to software emulation (not true).

To a certain degree I understand what you are saying, but just using the term 'emulation' is often an oversimplification without further explanation (or descriptive qualifiers)... If you are talking about 'emulation' (the term by itself), the first thing people traditionally think of is software emulation, which was 90% of all emulation out there until recent years. As in 'software emulators', examples being stuff like ZSNES, VisualBoyAdvance, Nesticle, etc...

The other form is an FPGA being mapped out to function like the original 'chips'; if you call that emulation, at least specify that it's 'hardware emulation'... Yes, not everyone calls it 'emulation'... Some call it 'hardware simulation': Near, Analogue, etc. But that's not really the point of this thread...

But simply saying "emulation" by itself, and comparing one device that uses an FPGA to mimic chips against emulators designed to 'run software' without needing the original hardware (Goomba, etc.), just makes things confusing... Advertising two devices of essentially the same technology and purpose as if one functions completely differently from the other is misleading, bordering on false advertising.

Of course there is a third/fourth option, 'recycled chips' or 'clone chips', which are one way of creating support for running games, but this is not viable in a flashcart that has to be rewritten to run a number of various mappers.

Offline Nemok

  • Jr. Member
  • **
  • Posts: 91
  • Karma: +0/-0
    • View Profile
I gave you the definitions I had for the terms emulation and simulation, and stayed consistent with them.

Now I'd really like to know the ones you have.

Offline Galron

Mostly, as I understand it, these are the terms as used by Near/byuu, and by the Analogue/MiSTer community.

One can call it 'low-level' vs. 'high-level', or 'hardware level' vs. 'software level'... But calling it 'emulation' alone, without any descriptive qualifiers to differentiate the various types, methods, and intents, just makes any discussion oversimplified and confusing.

'Low to medium' accuracy is usually software-based emulation, the least accurate method; this is basically Goomba. It takes the least amount of processing power, because it's self-contained, and it does the least amount of work to get ROMs working.

'High accuracy' to 'cycle-accurate' methods are something like byuu's higan emulator. It isn't simply trying to trick software (the ROMs) into running outside its native environment; rather, the emulator creates an environment that 'simulates' the original hardware environment, and that simulated environment runs the ROMs, effectively working at the 'hardware' level. This takes a lot of processing power, because it's a computer running within a computer; it's closer to 'virtualization' on some level. Simply put, this is high-end 'software-based emulation', emulating not just the ROMs but also the original operating system/hardware in a virtual environment.
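To illustrate why this "computer within a computer" approach eats processing power, here is a toy Python sketch of cycle-level lockstep: every component is stepped a cycle at a time so either can react to the other immediately. The classes and the 3:1 clock ratio are invented for illustration (loosely NES-shaped), not higan's actual design.

```python
class CPU:
    """Toy CPU: in a real emulator, step() would decode and execute one cycle."""
    def __init__(self):
        self.cycles = 0

    def step(self):
        self.cycles += 1


class PPU:
    """Toy video chip, ticking faster than the CPU."""
    def __init__(self):
        self.cycles = 0
        self.scanlines = 0

    def step(self):
        self.cycles += 1
        if self.cycles % 341 == 0:   # NES-like: 341 PPU cycles per scanline
            self.scanlines += 1


cpu, ppu = CPU(), PPU()
for _ in range(100_000):
    cpu.step()
    ppu.step()   # fine-grained interleaving: the host does several
    ppu.step()   # bookkeeping operations per emulated CPU cycle,
    ppu.step()   # because the PPU runs at 3x the CPU clock

print(cpu.cycles, ppu.cycles, ppu.scanlines)
```

A fast-but-loose emulator would instead run the CPU for thousands of cycles before letting the PPU catch up, which is cheaper but can miss tight timing interactions.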

Virtualization generally also involves running the underlying operating system as accurately as possible to the original operating system it is loading/based on... This is basically what Virtual PC within Windows is, or running Windows within DOSBox.
http://teraknorblogs.blogspot.com/2012/01/simulation-vs-emulation-vs.html


Simulation tends to be similar to virtualization, but generally refers to hardware, specifically 'chip accuracy'... This generally requires an FPGA to be 'rewritten' to run as the original chip being 'simulated', or a 'clone chip' that tries to have all the original code/data but can't be rewritten/updated (the latter is only going to be as good as its code, whereas an FPGA can be 'fixed' with firmware updates).


https://emulation.gametechwiki.com/index.php/Emulation_Accuracy

Quote
"Chip accuracy
By simulating each logic chip on the board individually, this not only takes a tremendous amount of processing power or specialized hardware to run (as in, even emulating something from the 1970s on a chip accurate level would need AAA-level system requirements to run at a good speed), but it also requires an incredible amount of effort to make, and it's also almost useless. Although it is the only way to achieve true hardware simulation, cycle accurate emulation can already achieve virtually indistinguishable accuracy from the real hardware, aside from a very negligible set of edge cases. In addition, cycle-accurate emulators have much lower system requirements and programming difficulty. The only chip accurate emulators that are currently usable run on Field Programmable Gate Arrays, or FPGAs, which are essentially custom programmable chips. Machines dedicated to this type of emulation exist, such as the Analogue NT Mini by kevtris or the RetroUSB AVS by bunnyboy. Other examples of chip accurate emulation can be found in flash carts such as the SD2SNES, where various add-on chips are emulated on the included FPGA. "

'Simulation' is more about the 'environment' than the software it's mimicking. That is, it's trying to be a self-contained environment simulating the original environment it's based on. But that means simulation can fall under 'low-level', 'high accuracy', or 'chip accurate' 'emulation', which are generally forms of 'hardware-based emulation'.

Between high-level (software) and low-level (hardware), the latter can more accurately be called 'hardware-based' (which includes rewriting the chips, or using a clone chip with the code already incorporated into it), while software-based relies on native components of the system it's using, creating a completely software-based environment (a basic emulator).
 

Hardware-based methods can simulate an entire environment (a system within a system within a system: the computer level, the emulator level, and a simulated hardware-level environment within the emulator). This is power-hungry, as mentioned... FPGA-based 'chip-accuracy' systems tend to strip out the 'emulator' level (since it's the original hardware/OS's job to do most of the work), while the FPGA or clone chips represent the original 'hardware' they are mimicking at the cartridge level.

Of course, something like MiSTer/Analogue products isn't trying to represent/rebuild a chip-accurate cartridge, but a cycle-accurate/chip-accurate version of the entire system itself (along with a lot of new QoL features added to it), running on 'modern hardware' through an FPGA. The FPGA essentially becomes the new CPU/other chips, and can be rewritten to work as accurately as possible to the original chip. There are also clone-based consoles that use clone chips, similar to FPGA but not rewritable... But that's also another hardware-based method...

https://www.polygon.com/2018/2/7/16934180/super-nt-review-super-nintendo-snes-analogue

Quote
"Because it’s not a software emulator — meaning, the hardware inside the Super NT isn’t running an operating system that is in turn running software to emulate the Super Nintendo — the Super NT has virtually no lag and has, at least in theory, perfect accuracy across all Super Nintendo and Super Famicom cartridges."

"Not being a software emulator also means that the Super NT doesn’t have some of the modern comforts afforded by emulator-based platforms. Quick x-second rewind? Nope, not here. A save state system? No."



Finally, there are the clones that use purely software methods... These are generally incompatible with flashcarts; they can't 'interface' the same way as original hardware... They will rip a ROM from a cartridge and then use it in a software-based emulator, but they don't actually use any original function from the cartridges themselves.
« Last Edit: October 12, 2021, 12:56 PM by Galron »

Offline Nemok

According to gametechwiki, same source

"While it may be theoretically possible to have a 100% perfect emulator, that feat is very rare (if not nearly impossible), even for some highly regarded emulators such as higan or kevtris's work on the various FPGA-based consoles by Analogue."


According to teraknorblogs, same source

"Emulation : it pretends to be the environment, such as a gaming console, where the 'code' in the guise of ripped ROMs are tricked into thinking that they are running on their host platform. Emu's run like the real thing, but are not. There tends to be no interaction between the emulator and the underlying system, where the emu is effectively the application."

"Then there is simulation, this is not the real system in any shape or form but has the potential to create an environment where the novice can practice different skills in a 'safe enclosure'. Flight simulators, are clearly not the real thing but offer an affordance that can prepare pilots and spotty teens for the real experience."

Offline Galron

The issue with the term 'simulate/simulation' is that there is no specific definition for it; it is used in different contexts...

The problem with emulation/simulation/virtualization and other terms is that there is no single standard; everyone uses them in different and/or overlapping ways. Unless someone is more specific and brings up 'low-level'/'high-level'/'cycle-accurate'/'chip-accurate'/hardware/software, etc., most people default to the basic idea of emulation, which is generally low-to-medium level and doesn't bother trying to copy the functions of the hardware it's based on.

So I can't give you a 'definitive' definition... There isn't one...

Any attempt to simply describe them all as 'emulation', as a single comprehensive category, either oversimplifies the technology being used or conflates and confounds what they do in a confusing way, to the point where the discussion becomes meaningless.

But here are examples of how 'simulate' is not always about 'flight sims', and flight sims obviously aren't 'emulation'... though some flight sims are obviously 'emulated'...

Quote
https://arstechnica.com/gaming/2021/06/how-snes-emulators-got-a-few-pixels-from-complete-perfection/

 For a truly perfect emulator, just making ~3,500 commercially released SNES games playable isn't enough. Every function of the system has to be simulated with cycle-perfect accuracy, too.

Quote
https://www.geeksforgeeks.org/difference-between-emulation-and-simulation/

2. Simulation :
Simulation, as name suggests, is a technique that helps computer to run certain programs developed for different OS. It usually provides essential method of analysis that is easily communication, verified and understood. Its main purpose is to shed light on mechanism that usually control behavior of system. It is totally computer based and uses algorithms and equations.

Quote
https://www.digitaltrends.com/gaming/best-snes-emulators/

Many of the most popular SNES emulators began development during the late-1990s. Because of the lack of computational power, these emulators tended to focus on High-Level Emulation (HLE), which tries to simulate the response of a system efficiently but doesn’t attempt perfect accuracy.

Even the idea of "hardware emulation" can potentially have more than one meaning...
Quote
https://www.electronicdesign.com/technologies/eda/article/21801171/11-myths-about-hardware-emulation

Another term that gets tossed around is "replication"...

Quote
https://www.reddit.com/r/AnaloguePocket/comments/i3sy32/fpga_emulation_vs_the_real_thing/

So technically an FPGA is emulation, but not the emulation you are used to. FPGA is basically hardware replication. It uses the same chip logic as an original Gameboy, so theoretically it IS a Gameboy. Software emulation, on the other hand, is software that acts as the hardware. It's usually not as accurate as an FPGA implementation.

The Pocket (and the other contemporary Analogue consoles) use an FPGA to re-create the logic flow of the original hardware. If this sounds like emulation, it is, but different from the emulation you see on Android, PC, etc. Note that FPGAs are used for more than emulation but I'll delve into this from an emulation perspective.

Emulators are software emulation. Code is written to mimic what the original hardware does, then is compiled to machine code for the target platform, and the program is executed to produce a similar output to what you would expect on a real console. With this approach, it's easier to code for, but you're taking instructions for one piece of hardware and translating them to run on another piece of hardware. There is oftentimes significant overhead in translating these calls, and in some cases (like with n64 emulation) it's nigh impossible to get it 100% accurate and also run it at a playable speed. So you wind up with games that either run super slow, or games that make concessions to become playable. There are some systems that are accurately emulated but require beefier hardware to run accurately at a playable speed.

FPGAs however are different. Standing for Field-Programmable Gate Array, FPGAs are basically programmable hardware. Normally, hardware is immutable, meaning you can't change existing hardware without making physical changes to it. But in the FPGA model you write code to do what the original hardware does, but instead of compiling to machine code and executing a program, you "compile" the FPGA instead. This is basically mutable hardware that can be changed based on what the program asks of it and hardware is far more performant than software is. This would be classified as hardware emulation since with an FPGA, as long as it is large enough to hold the complete instructions for an embedded system, something like the Gameboy or SNES can be reverse engineered and be recreated on an FPGA in a near 1:1 fashion. This results in a combination of fast and accurate emulation while needing less resources to achieve the same result from a software emulator.

I'm a software engineer and I don't work with FPGAs myself, so some of the FPGA development details may be off. But this is the general difference between software emulators and hardware emulation with an FPGA along with a high level overview as to why FPGA emulation can be more accurate and performant than software emulation."

Quote
https://samagame.com/en/when-emulation-is-not-enough-mister-fpga-is-the-project-that-simulates-all-kinds-of-classic-hardware-machines/

When emulation is not enough: MiSTer FPGA is the project that simulates all kinds of classic hardware machines...

The world of emulation is stupendous, and it allows us to continue to enjoy mythical machines and games that would otherwise probably be lost to memory.

However, software emulation is not always sufficient, and that's where MiSTer comes in: a project that transformed the Terasic DE10-Nano board into an absolute wonder for those who want to enjoy these platforms and their content as they were designed. This is not software emulation, but hardware simulation, and its fidelity to these architectures and games is absolute.

Of course 'software replication vs hardware replication' is a thing too...

Quote
https://searchdatabackup.techtarget.com/tip/The-differences-between-hardware-replication-and-software-replication
« Last Edit: October 12, 2021, 05:35 AM by Galron »

Offline Galron

There simply are no 'standards' in the terminology... Read more than one book, source, or expert, and they each give different or overlapping definitions, sometimes using similar terms to mean very different processes. That is, they give more than one definition/usage for the same term/word.



Quote
https://dfarq.homeip.net/fpga-vs-retro-hardware/


A purist will object to modern FPGA approaches, usually for more than one reason. But there can be practical advantages to an FPGA solution, and it’s also possible to blend it with a more traditional approach.

FPGA is not the same as emulation
An FPGA can be reprogrammed to implement vintage computer chip designs on new hardware, giving you old chips without the reliability and scarcity issues.

The first thing to get out of the way is the question of FPGA vs emulation. FPGA adherents stress they aren’t the same thing. The difference may or may not be enough for a purist. But there’s enough difference that an FPGA approach is good enough for a class of hobbyists who aren’t happy with the results of emulation.


Quote
http://my-cool-projects.blogspot.com/2016/03/emulation-vs-fpga.html

I've been hearing people talk about "one to one" (ie 1:1) or "not emulation" when talking about FPGAs.  What do they mean when they say this?

Now, what might people mean when they say "not emulation" ?

CPU emulators, such as what one may find in MAME, take a few shortcuts in order to achieve decent performance.  They do not emulate the clock of the CPU, but instead are designed to execute a variable number of cycles in one shot as quickly as the host machine (ie a modern x64 computer) can execute.  The code that is driving this execution is then responsible to regulate the overall speed of the system so that it does not run too quickly.

For example, let's say we are emulating a 1 MHz Z80 cpu.  The cpu management code may tell a z80 emulator to execute 1000 cycles which would take 1 ms on original hardware.  The z80 emulator would then go execute these 1000 cycles as fast as possible and report back how many cycles were actually emulated (it might be more than 1000 because some instructions take more than 1 cycle).  The cpu management code would then have to stall until the 1 ms period has completed before executing the next chunk of cycles.

This creates an unauthentic experience because it means that instead of instructions being executed at a steady slower cadence, they are executed in quick bursts with delays in between.  This is usually not noticeable by a human playing the emulated game because it's just 1 ms, but several problems can arise depending on the other architecture of the original hardware.

On a game like Dragon's Lair where there is just one CPU and a steady clock that never varies, the above method of emulation is "good enough."  A human is not really going to notice any meaningful difference in accuracy.

But what if the game has multiple CPUs such as a dedicated sound CPU?  Now the emulator has to execute a smaller slice of cycles on the first CPU, then switch to the second CPU and execute another smaller slice of cycles.  If there are interactions between these two CPUs (and there usually will be), the slice of cycles that gets executed needs to be small enough so that there is no unnatural lag in the interactions which hurts performance of the overall system.  And even if each CPU takes turns executing just 1 cycle, the potential for an inaccurate interaction between the two emulated CPUs still exists since the emulator does not take into account the clock.

Now, what if the original hardware fiddles with the CPUs clock, or the clock is not constant for some reason?  The Williams games are notorious for doing this.  On a lot of the Williams games, like Joust, their custom DMA chip will actually halt the CPU while the DMA operation is running.  On Star Rider, the CPU's clock gets halted every field for unknown reasons (that's on my TODO list to figure out why).  Last time I checked, this behavior was not emulated very well in MAME (it may have improved since I last checked) and certainly Daphne is not equipped to handle this type of scenario.  However, an FPGA would be able to handle it just fine.

Now, does this mean that emulators like MAME and Daphne can't be improved to take into account a variable clock?  Not at all.  As modern computers get faster, it will become more feasible for emulators to become more accurate without hurting performance.  I believe that aside from the problems associated with running on a modern operating system (with many processes and threads all competing for the CPU's time), there is no reason why software-based emulators cannot achieve 100% accuracy as their architectures are improved.  However, I do not believe that that day is here... yet.

I hope that gives people a better idea of what I consider the difference to be between FPGA solutions and emulation solutions.
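The burst-then-stall scheduling the quoted post describes can be sketched in Python. This is a toy: `ToyZ80` and its fake instruction lengths are invented for illustration, not MAME's or Daphne's actual code.

```python
import time

class ToyZ80:
    """Stand-in for a real CPU core: each fake 'instruction' takes 1-4 cycles."""
    def __init__(self):
        self.total_cycles = 0

    def execute_cycles(self, budget):
        # Run as fast as the host allows until at least `budget` cycles are done.
        done = 0
        n = 0
        while done < budget:
            done += 1 + (n % 4)   # fake instruction lengths of 1..4 cycles
            n += 1
        self.total_cycles += done
        return done               # may overshoot the budget, like a real core

CLOCK_HZ = 1_000_000              # emulating a 1 MHz CPU
SLICE_CYCLES = 1000               # 1000 cycles == 1 ms of emulated time

cpu = ToyZ80()
start = time.perf_counter()
for _ in range(5):                # emulate 5 ms of machine time
    cpu.execute_cycles(SLICE_CYCLES)       # quick burst, as fast as possible
    # Stall until wall-clock time catches up with emulated time.
    target = start + cpu.total_cycles / CLOCK_HZ
    while time.perf_counter() < target:
        pass                      # busy-wait; real emulators sleep or sync to video

elapsed = time.perf_counter() - start
print(f"executed {cpu.total_cycles} cycles in {elapsed * 1000:.2f} ms")
```

The instructions execute in microsecond bursts with idle gaps in between, rather than at the original steady cadence, which is exactly the inauthenticity the post is pointing at.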

Quote
 
https://www.reddit.com/r/emulation/comments/7vjehj/fpgas_arent_magic_an_article_by_the_creator_of/

"FPGA implementations are emulation, too" isn't really an accurate statement, since they're not emulation, they're hardware clones (that is, re-making hardware in hardware is inherently different from re-making it in software). However, what I think byuu is getting at is 100% true: just because it's a hardware implementation doesn't mean it's accurate to the original hardware. Case in point: the various poor-quality NES, MD and SNES hardware clone systems that have been around forever. Those ASICs were made using the same techniques and HDL as an FPGA implementation would be, except they suck because the Chinese knockoff artists just slapped them together with little attention to quality.

The emulation/retrogaming scene associates FPGAs with extremely high accuracy because the only FPGA-based products released for us have been very high quality (RetroUSB's AVS and Analogue's Nt Mini), but the only reason these products are better than the shitty ASIC clones is that the authors worked hard on them and prioritized making a great product.

tl;dr: FPGAs aren't made of magic accuracy powder.

(If anyone is interested in an ELI5 analogy: you can think of it in terms of a physical object. That is, for any given object, I could make a 3D model in Blender, or I could make a copy of the object out of clay. Neither is an inherently more "accurate" representation of the original, and how close the output of either is to the original depends more on the creator's attention to detail than on the format itself.)

I don't know much about FPGAs, but if I understand it correctly they are chips with programmable behaviour, right?

Not really. The way to think of an FPGA is a giant matrix of logic gates, a giant matrix of wires linking them together, and programmable switches to connect the wires to the gates, and all of those connections are fixed once they're set. (At least until the entire unit is reprogrammed).

It's literally creating hardware -- you're defining what wires run where, and to what logic gates. You can create a turing-complete CPU core in a FPGA, but you don't have to. Every distinct grouping of electronics you define runs literally simultaneously, and early games were written with discrete electronics and multiple coprocessors. With emulation, you're creating interpreters and simulations of the hardware that run sequentially -- to the best ability of the CPU mimicking simultaneous operation. With an FPGA, you're literally running them simultaneously.

The primary difference is you don't do anything procedural in an FPGA. You don't program them. You just use a Hardware Definition Language to define what the electronics look like.

So, as an example, if you loaded a 6502 core into an FPGA to emulate an early Atari, you're not emulating a 6502, you're literally creating a 6502 clone in hardware. You're not saying "okay, if the opcode is this, then go do this, then go do that, then increment the program counter", you're saying "I need a set of logic gates that connect to these other gates, triggered on the clock, and based on those gates, I'll enable this other circuit".

It's completely different, but the FPGA languages, at a casual glance, can make them seem more similar than they actually are.
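The sequential "if the opcode is this, then do this, then increment the program counter" style the post contrasts with HDL can be sketched as a fetch-decode-execute loop. This is a made-up three-instruction machine for illustration, not a real 6502.

```python
def run(program):
    """Toy interpreter: fetch an opcode, decode it, execute, advance the PC."""
    acc = 0          # accumulator
    pc = 0           # program counter
    while pc < len(program):
        opcode = program[pc]
        if opcode == 0x01:          # LOAD imm: put the next byte in the accumulator
            acc = program[pc + 1]
            pc += 2
        elif opcode == 0x02:        # ADD imm: add the next byte to the accumulator
            acc += program[pc + 1]
            pc += 2
        elif opcode == 0xFF:        # HALT
            break
        else:
            raise ValueError(f"unknown opcode {opcode:#04x}")
    return acc

# LOAD 5; ADD 7; HALT
print(run([0x01, 5, 0x02, 7, 0xFF]))   # prints 12
```

Every step here happens one after another on the host CPU; on an FPGA the equivalent logic gates all settle in parallel on each clock edge, which is the distinction the quoted post is drawing.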

Under one definition of the various terms, this moves us back towards 'clone chips', which an FPGA is also a subset of, being a programmable clone chip, vs. one that is locked in one state.
Quote
https://bootleggames.fandom.com/wiki/Clone_consoles
Hardware vs Emulation
Clones can play games in two ways: through hardware or emulation. Hardware clones, like all official consoles, will work better than emulation-based consoles, so if the official console can play the game, the clone should too. However, even hardware clones are often not completely compatible with all games; many recent clones, particularly Famiclones and Mega Drive clones, incorporate the entire system into a single chip. This is smaller, cheaper to produce and uses less power than a traditional clone, but is less compatible.

Emulation-based consoles are not as reliable; furthermore, like in computer emulators, the game might not properly work, if at all (e.g. Somari's ending). However, emulation-based clones are comparatively rare, especially of the Famicom, as NOAC technology is so widespread - they appear to be most common for the GBA and Mega Drive. Many emulation-based Mega Drive consoles are produced by AtGames under official license from Sega, but are still generally considered clones as they do not use Sega's original hardware designs and suffer from compatibility and emulation quality issues.
« Last Edit: October 12, 2021, 05:47 AM by Galron »

Offline Nemok

There simply are no 'standards' in the terminology... Read more than one book, source, or expert, and they each give different or overlapping definitions.

According to kevtris, in his RetroRGB interview, he's been using FPGAs to make cartridge emulators.
https://youtu.be/Px3sNvfRZ7s (starting at 31'10'')

3 expert sources, 1 definition.

Offline nuu

  • Hero Member
  • *****
  • Posts: 2338
  • Karma: +99/-2
    • View Profile
Simulation is in my experience more often defined as Nemok described. Like those Game & Watch simulations that this guy made long ago: they were just computer programs that look and feel like Game & Watch games without doing any kind of emulation at all. Now that many Game & Watch systems have been decapped, studied, and had their ROMs dumped, accurate emulation has become possible in MAME.

The low-level and high-level definition looks strange to me. Usually low-level emulation is when the system is emulated down to a low hardware level, which normally means more accurate emulation, and high-level means a high abstraction level and more inaccurate emulation.

I think it's necessary to use separate terms for "software based emulation" and "hardware based emulation", otherwise this discussion wouldn't have happened, and it comes up all the time. The word "based" can't be skipped either because "software emulation" sounds like it's the software that's emulated, both types of emulations are hardware emulation (they both emulate hardware not software). Or something more definitive like "MPU (microprocessor unit) based emulation" and "PLD (programmable logic device) based emulation".

Offline Galron

Quote
Simulation is in my experience more often defined as Nemok described. Like those Game & Watch simulations that this guy made long ago: they were just computer programs that look and feel like Game & Watch games without doing any kind of emulation at all. Now that many Game & Watch systems have been decapped, studied, and had their ROMs dumped, accurate emulation has become possible in MAME.

Right, that is one usage of 'simulation'; context matters. There are also 'software simulation' and 'hardware simulation'. Game & Watch falls more on the software simulation side of things.

https://www.ispringsolutions.com/articles/what-is-software-simulation

Hardware simulation is something more complicated, and generally more abstract, in that by one definition it tends to be something that is run in the background and used to test how hardware will theoretically work, but tends to be less 'interactive' for the purpose of debugging something... It might be real-time, or a different speed, depending on how it is used.

https://en.wikipedia.org/wiki/Hardware_emulation

Note the self-contained "Ikos NSIM-64 Hardware simulation accelerator."

Quote
The low-level and high-level definition looks strange to me. Usually low-level emulation is when the system is emulated down to a low hardware level, which normally means more accurate emulation and high-level means a high abstraction level and more inaccurate emulation.
You are also quite right, in that there is another definition for low-level and high-level...

The above definition is the one based on some of Near's writings, in which he separates 'software-based emulators' into three types: low-level, medium-level, and high-level...

But others reverse the meanings of low-level and high-level, which may refer to hardware or software.



Quote
https://www.pcgamer.com/how-emulators-work/

Low-level emulation

In low-level emulation, the PC pretends it’s the video game console.

Low-level emulation (LLE) simulates the behavior of the hardware to be emulated. The host computer will create an environment for the application to run where it’ll be processed, as closely as possible, as the emulated hardware would do it. For the most accurate emulation, not only are all the components simulated, but their signals as well. The more complex the system, either by having more chips or a complicated one, the more difficult it becomes to do LLE.

LLE can be achieved via hardware or software. In hardware, the actual hardware or something that can substitute it resides in the system itself. The PlayStation 3 in its first two models did hardware emulation by containing the actual hardware used in the PlayStation 2. Older Macintosh computers had an add-on card, called the MS-DOS Compatibility Card, that contained a 486 processor–based system to run x86 applications.

Software low-level emulation is as it sounds, it simulates the hardware using software. Many retro video game consoles and 8-bit home computers are emulated this way by using well understood components (it’s harder to find a popular system that didn’t use the venerable MOS 6502 or Zilog Z80). One aspect that can make or break an emulation is how often it syncs up each emulated component. For example, the SNES emulator Higan aims to be very accurate by increasing the amount of times the components sync up with each other. This allows for games that had timing hacks or other quirks of timing to be playable. The cost of this however, is that Higan requires a very fast processor relative to what it’s trying to emulate.
High-level emulation

In high-level emulation, the PC provides software hooks so the game can run on its hardware.

High-level emulation (HLE) takes a different approach to simulating a system. Instead of trying to simulate the hardware, it simulates the functions of the hardware. In the mid-'90s, hardware abstraction was spreading to more computer systems, including video game consoles. This allowed for ease of programming as now developers didn’t have to invent and reinvent the wheel.

Hardware abstraction is a way of hiding the intricate details of controlling hardware. Instead, it provides a set of actions that a developer commonly uses and does all the little details automatically. An example is how storage drive interfaces came about. Originally, if a developer wanted to read data from a drive, they had to command the drive to spin up, position the read/write head, and get the timing down to read the data, pull the data, then transfer it over. With hardware abstraction, the developer commands “I want to read at this place” and the firmware on the drive takes care of the rest. An HLE takes advantage of hardware abstraction by figuring out what the command(s) are intended to do in the emulated environment, and letting the host hardware do the rest.

HLE has three primary methods of simulating functions of the hardware.

    Interpreting: The emulator executes the application’s code line by line, by mimicking what each instruction is supposed to do.
    Dynamic Recompiling: The emulator looks at chunks of the application’s processor instructions and sees if it can optimize them to run better on the host computer’s processor. This is opposed to running each instruction one by one, which usually results in lookup overhead penalties.
    Lists interception: Co-processors, like the GPU and audio chip, that have enough hardware abstraction require the main processor to send command lists. These are a series of instructions that tell the co-processor what to do. The emulator can intercept the command list and turn it into something the host computer can process on a similar co-processor. For example, command lists going to the emulated system’s GPU can be intercepted and turned into DirectX or OpenGL commands for the host’s video card to process.

An example of an HLE is the Java Virtual Machine (JVM). Java code is not actually compiled and run natively on the host machine, but instead, the host machine runs an emulator of a theoretical Java machine. Applications made for Microsoft’s .NET framework also run in this fashion. This way of running an application is commonly known as just-in-time (JIT) compiling.

The performance HLEs can provide is such that it was possible to emulate the Nintendo 64 on a Pentium II processor in 1999, three years after the console’s release. In fact, this is the most likely way the Xbox One can emulate the Xbox 360, despite running hardware that isn’t vastly superior to it.
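As a rough illustration of the first HLE method the article lists (interpreting), here is a toy fetch-decode-execute loop over an invented three-opcode instruction set. Everything here is made up for illustration and doesn't correspond to any real console's CPU; it just mimics what each instruction is supposed to do, one at a time, the way the article describes:

```python
def interpret(program, memory):
    """Toy interpreter: each instruction is an (opcode, operand) tuple.

    The emulated CPU has a single accumulator register; the loop
    mimics each instruction's effect one at a time.
    """
    acc = 0   # accumulator register of the emulated CPU
    pc = 0    # program counter
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":     # acc = memory[arg]
            acc = memory[arg]
        elif op == "ADD":    # acc += memory[arg]
            acc += memory[arg]
        elif op == "STORE":  # memory[arg] = acc
            memory[arg] = acc
        pc += 1
    return memory

# 2 + 3, stored back to address 2
mem = {0: 2, 1: 3, 2: 0}
interpret([("LOAD", 0), ("ADD", 1), ("STORE", 2)], mem)
assert mem[2] == 5
```

Dynamic recompilation would instead translate a chunk of such tuples into host-native code once and reuse it, trading the per-instruction dispatch overhead of this loop for translation work up front.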

However, it should be pointed out that the low/medium/high usages below apply only to the 'accuracy' of emulation, not the type of emulation. Low/medium/high accuracy is not the same thing as low-level/high-level emulation...

Quote
Low accuracy

An emulator isn't accurate when it has a large amount of visual and audio glitches and favors performance as much as possible. To work around these glitches, emulator developers typically include game-specific hacks (and prioritize popular games) to skip over problems, such as compatibility issues that can cause games to break. Many times, these emulators will be deemed incompatible with the less popular games. As Near (then known as byuu) explains in a 2011 Ars Technica article linked below, Speedy Gonzales: Los Gatos Bandidos will soft-lock towards the end due to a specific hardware edge case that isn't emulated in ZSNES or Snes9x, but is properly dealt with in his own emulator higan due to his documentation of the system. This can also become very problematic when ROM hacks abuse software errors to create otherwise impossible behaviors to achieve what they can. When a ROM hack can only be used in that one specific emulator, he explains, it becomes incompatible with real hardware (either through a flash cart or printed), and that such an issue has occurred with ZSNES before and continues to occur with Nintendo 64 ROM hacks.

Newer emulators tend to favor High-Level Emulation (HLE) as opposed to Low-Level Emulation (LLE), which results in lower accuracy. While emulators like Dolphin favor accuracy but still retain HLE for performance and have successfully used it to an advantage, these types of exceptions are uncommon and it can still hinder accuracy.

Medium accuracy

Most emulators headed by multiple developers tend to have fewer glitches but still, have many problems.

High accuracy
Emulator developers often strive for high accuracy when the system cannot effectively be cycle accurate. Their emulator replicates the components of the original system as closely as possible, and as Near explains it's that reason that more processing power is required to do so. This results in fewer audio and visual glitches and better handling of edge cases used by creative game programmers. An emulator with high accuracy may or may not be cycle-accurate and sometimes, they achieve 100% compatibility with commercially released games.
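As a rough sketch of what syncing components every cycle means in the Higan example quoted earlier, here is a toy main loop (the Cpu/Ppu classes are invented stand-ins, not code from any real emulator):

```python
class Cpu:
    """Toy CPU core: just counts cycles executed, standing in for
    real instruction decoding and execution."""
    def __init__(self):
        self.cycles = 0

    def step(self):
        self.cycles += 1  # do "one cycle" of emulated work


class Ppu:
    """Toy video chip that must stay in lockstep with the CPU."""
    def __init__(self):
        self.cycles = 0

    def step(self):
        self.cycles += 1


def run_frame(cpu, ppu, cycles_per_frame):
    # Cycle-accurate style: re-sync every single cycle. Fewer timing
    # glitches, but far more host work than syncing once per frame,
    # which is why this approach needs a fast host processor.
    for _ in range(cycles_per_frame):
        cpu.step()
        ppu.step()


cpu, ppu = Cpu(), Ppu()
run_frame(cpu, ppu, 1000)
assert cpu.cycles == ppu.cycles == 1000
```

A lower-accuracy emulator would let the CPU run thousands of cycles ahead before letting the video chip catch up; that is faster on the host but breaks games that rely on tight timing between the chips.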

This is yet another case where 'experts' each have their own definitions and there is no single standard... It all just depends on which industry you come from, really: engineers, programmers, etc.

While there are 'experts' that agree on certain aspects, not all 'experts' use the terms the same way... To simply look at one small group of 'experts' is "appeal to authority" bias, especially problematic when the authorities have almost no agreement and the terms' definitions change depending on the context in which they use them. So it's important to read the context, and not just the terminology.

As someone astutely points out:

Quote
https://en.wikipedia.org/wiki/Talk%3AEmulator

I think a problem is there is no agreed-upon definition. A historical definition is "emulation" uses hardware support. That's not typical usage today but was typical for a long time. "Simulation" is sometimes used to mean "works like a device which is not yet built", while "emulation" is sometimes used to mean "works like a device which is no longer built." Again, that's not an agreed-upon definition, but I hear it used. Also "emulator" as a way to "work like the original", while simulator as a way to "measure something like the original". Also, notions of accuracy, where "emulator" may be either less or more accurate than a simulator. The dictionary definitions suggest "emulate" is more accurate than "simulate". Perhaps the clearest approach is to list the various (confusing and sometimes contradictory) uses, so when people use a term speakers are inclined to say "simulator, by which I mean..." and listeners are inclined to ask "emulator, by which you mean...?"

Quote
Appeal to authority is a common type of fallacy, or an argument based on unsound logic. When writers or speakers use appeal to authority, they are claiming that something must be true because it is believed by someone who is said to be an "authority" on the subject.
Quote
I think it's necessary to use separate terms for "software based emulation" and "hardware based emulation", otherwise this discussion wouldn't have happened, and it comes up all the time. The word "based" can't be skipped either because "software emulation" sounds like it's the software that's emulated; both types of emulation are hardware emulation (they both emulate hardware, not software). Or something more definitive like "MPU (microprocessor unit) based emulation" and "PLD (programmable logic device) based emulation".

This is the point I've been trying to make from the start: if you don't specify the type, which the reseller clearly didn't, then most people assume you mean 'software based emulation'... The problem is 'software based emulation' can also be confusing, as there are software emulators that also emulate at the hardware level, such as Higan for example.
« Last Edit: October 12, 2021, 12:51 PM by Galron »

Offline Nemok

  • Jr. Member
  • **
  • Posts: 91
  • Karma: +0/-0
    • View Profile
Galron, I find no interest in trying to discuss here, you made that impossible.

You may keep quoting the entire web if it pleases you.

Offline Galron

  • Hero Member
  • *****
  • Posts: 802
  • Karma: +15/-0
    • View Profile
Galron, I find no interest in trying to discuss here, you made that impossible.

You may keep quoting the entire web if it pleases you.

That’s fine. I think this discussion goes well and far beyond the scope of the original intent of this thread and the claims made by the reseller... It’s clear that all flashcarts use FPGAs and work essentially the same, and that one isn’t “only emulation” while the other is only “hardware”...

Offline nuu

  • Hero Member
  • *****
  • Posts: 2338
  • Karma: +99/-2
    • View Profile
It's not uncommon for a term to have many different definitions depending on the person. That's why you have to describe your definition when doing some kind of write-up.


Game & Watch falls more to the software simulation side of things.
I don't get this though. These simulations simulate both hardware and software. It was believed for a long time that G&W systems were all hardware, like most of the oldest arcade systems and video game consoles (such as Pong), but we now know that G&W uses a microprocessor and program software in a ROM, although the CPU and ROM are both internal to the ASIC.

Offline Nemok

  • Jr. Member
  • **
  • Posts: 91
  • Karma: +0/-0
    • View Profile
nuu

Most G&W simulators are software that simulate both G&W hardware and software as a whole, focusing on the look and feel, not what's under the hood. They do not contain a single bit of 'code' of the original, whether hardware (if that makes sense) or software.

And you're right, the MAME team is responsible for the incredible effort of reverse engineering their CPU and dumping their ROM. So we also have G&W emulators that are software, emulating G&W hardware, running the truly original software in the form of a ROM file, as we're used to.

For those interested, here's a list of G&W emulators and simulators: https://emulation.gametechwiki.com/index.php/Game_%26_Watch

Offline Galron

  • Hero Member
  • *****
  • Posts: 802
  • Karma: +15/-0
    • View Profile
The fact that they dumped the ROM rather than just rebuilding the game from the ground up is fascinating as a whole as well. The software side covers the graphics, which obviously ROMs can’t display on their own, and which are made by taking images of the actual G&W screens and the artwork from those screens.

Offline nuu

  • Hero Member
  • *****
  • Posts: 2338
  • Karma: +99/-2
    • View Profile
I think you meant the hardware side. The ROM is the software side and the physical bezel and liquid crystal segments would be considered hardware.
But yes, it's fascinating. The images are made in vector form to mathematically preserve the shapes of the artwork as closely as possible and to allow resizing without hurting the resolution. Hopefully it's enough information to manufacture a good replacement screen for a G&W game.


nuu

Most G&W simulators are software that simulate both G&W hardware and software as a whole, focusing on the look and feel, not what's under the hood. They do not contain a single bit of 'code' of the original, whether hardware (if that makes sense) or software.
Isn't that exactly what I was saying?