- C++ plus languages appropriate to your OS for binding;
- ... with currently only a Mac binding (so, Objective-C for bridging and Swift for the main action over there, but it's a full document-model app: open as many pieces of Vic software simultaneously as you like);
- multi-machine (but no other Commodores so who cares?);
- cycle accurate in that all decisions are made per cycle; it's just very unlikely that my 6560 or 6522 yet make exactly the right decisions;
- most interesting feature: video is generated as its genuine composite waveform, which your GPU then decodes. That gives always-correct aspect ratios, authentic artefacts, full interlacing support should any piece of software ever actually use it, diagonal scans, etc.
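To make that last bullet concrete, here's a minimal sketch of what "video as its composite waveform" amounts to. This is purely illustrative, with invented names, and is not the emulator's actual code: each sample of the one-dimensional stream is luma plus chroma modulated onto the colour subcarrier, and the GPU's decoding job is to reverse this.

```cpp
#include <cmath>
#include <vector>

// Illustrative only, not the emulator's real interfaces: encode one scan line
// into a one-dimensional composite sample stream.
struct Pixel {
    float luma;        // 0..1
    float saturation;  // chroma amplitude; 0 = no colour
    float phase;       // chroma phase relative to the subcarrier, in radians
};

std::vector<float> encode_line(const std::vector<Pixel> &pixels,
                               float samples_per_cycle,   // samples per subcarrier cycle
                               float line_start_phase) {  // subcarrier phase at line start
    const float kTwoPi = 6.283185307f;
    std::vector<float> composite;
    composite.reserve(pixels.size());
    float subcarrier = line_start_phase;
    for (const Pixel &p : pixels) {
        // Composite sample = luma plus chroma modulated onto the subcarrier.
        composite.push_back(p.luma + p.saturation * std::cos(subcarrier + p.phase));
        subcarrier += kTwoPi / samples_per_cycle;
    }
    return composite;
}
```

Swapping `std::cos` for its sign (a ±1 square wave) is the square-subcarrier experiment discussed below.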
On that final bullet point: I understand from earlier posts and private messages that the VIC uses a square wave rather than a sine wave to approximate the colour subcarrier. So I've been playing with that. I'm currently emulating an NTSC machine only, which I understand means that each line in any field is generated with the same phase of colour subcarrier and that the phase alternates between fields. Is that correct? When I implement PAL my understanding is the opposite: phase alternates between lines but ends up being the same between line N on field Q and line N on field Q+1.
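If that reading is right, the per-line phase rule is tiny. A sketch of my interpretation as stated above, in half-cycle units and with invented names, not verified against real hardware:

```cpp
// My reading of the subcarrier phase rules (unverified against hardware):
// returns the phase offset at the start of a line, in half-cycles (0 or 1).
enum class Standard { NTSC, PAL };

int line_phase_offset(Standard standard, int line, int field) {
    switch (standard) {
        case Standard::NTSC:
            // Same phase on every line within a field; alternates between fields.
            return field & 1;
        case Standard::PAL:
            // Alternates between lines; line N of field Q matches line N of field Q+1.
            return line & 1;
    }
    return 0;
}
```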
A bunch of screenshots are attached. I have something of a cheap-trick attempt at phosphor decay, so the emulator's normal ongoing output looks different from how individual fields look; where appropriate I've separately captured single fields. My (emulated) CRT is very forgiving and will just believe any colour burst it sees. I'm sure some of the subcarrier phases aren't correct* but I'd be grateful if somebody with experience of the real NTSC hardware could take a quick look and tell me whether I'm even in the right ballpark.
Added: a screenshot of Frogger. Does the real hardware really show that banding on ostensibly solid colours? It's completely eliminated if I use a sine wave, but appears with a square, so I don't know.
(EDIT: but, again, just to emphasise this: it's an actual one-dimensional composite wave that's actually decoded by the GPU, not a post-processing effect on an initially perfect frame buffer. It's not merely an attempt to recreate the feel of a TV without actually having emulated one.)
* in NTSC the phases I picked ended up making a nice pattern of even colours being spaced 1/8th of a colour cycle apart and odd colours always being diametrically opposite, but I'm not necessarily convinced by that.
EDIT: the forum doesn't appear to show file names (?), so:
A sine-wave output, I guess similar to what you'd expect of a VIC-II:
A single field of that sine-wave output:
What happens if I replace the colour subcarrier with a square wave:
... and a single field of square-wave output:
That same single field with the saturation turned up:
A screenshot of Frogger with square-wave output, showing wide-area colour banding that I suspect may be an error:
All captured at full-screen resolution on my development computer, a c.2011 MacBook Air, on which each instance of a Vic-20 takes something like 35–40% of a core to run. There's no audio yet, so that'll go up a touch. Output scales to any resolution and otherwise sits in a normal desktop window, sized and placed however you want, etc. It aims to be a completely natural native app, which is why the front-end part is platform specific; the emulation code is all platform neutral, though.