I caved in on dependencies and added "dplug" just to use its delay line rather than write my own (not that it's incredibly hard, but I'm getting tired of working on basic DSP).

Here is an example of allowing delay feedback to run amok. I'll be working on fun parameters for tuning tap length, mix, and modulation next.


The Kha guide is complete and is here. I broke the streak right at the end, just got forgetful after finishing, but that's OK.

Postmortem thoughts

I churned through a lot of ideas in the past few months along the lines of "making game related tools". This forced me to confront what I meant by that and how I could execute on it in a reasonable amount of time, without any meandering.

What Went Right

  • I started several good projects that I wouldn't have worked on otherwise
  • I finished at least two really worthwhile ones, Pixelbound and the Kha documentation, and also developed library code for others
  • I developed a broader personal/creative identity - broke out of the "gamedev" or "programmer" pigeonhole into something more rounded
  • It wasn't stressful on time; it simply reoriented time I was already going to spend

What Went Wrong

  • A weekly deadline made it hard to ship interesting things, so a lot of posts were simple blog updates
  • It didn't turn into a community interaction, really
  • It's the end of the year and I'm still on a job hunt?

Conclusions

This was a really good use of Streak Club, and I should come up with another streak for next year, with a different goal for stretching myself. Even if I only present blog posts, it helps with habit-forming.


It's not done yet but it's close to usable!

I got sidetracked by holiday-related things. Still, there was a little bit of writing progress, and the API was updated, so now I'm going to have to go back and fix the old work.

It's coming along. I have an example game in progress.


I've been working on a guide for using the Kha software framework. It builds on Haxe to provide a highly portable abstraction for games and interactive media applications, including graphics, audio, input, etc. The guide will serve as an introduction to the API, and also give Helpful Tips for Game Programming.

Uh, it's not done yet. Maybe it will be by next week! *Hopeful*

I settled on a 2-tap delay running at the same oversampling rate as the oscillator. The first tap is an allpass filter (it attenuates its own feedback on output); the second is a comb filter (a simple delayed copy of the original signal).
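
Roughly, the structure is something like this D sketch (this is not the real code or dplug's delay line; the names are placeholders, and how the two taps are summed at the output is my guess):

    // Sketch of the 2-tap delay described above. Tap 1 is a Schroeder-style
    // allpass (its feedback is attenuated on output), tap 2 is a feedforward
    // comb (a delayed copy of the original signal). One shared gain for both
    // taps, purely to keep the sketch short.
    struct TwoTapDelay
    {
        float[] buf1, buf2;
        size_t pos1, pos2;
        float g = 0.5f;   // gain; the real tuning parameters are still TBD

        void initialize(size_t len1, size_t len2)
        {
            buf1 = new float[len1]; buf1[] = 0.0f;
            buf2 = new float[len2]; buf2[] = 0.0f;
            pos1 = pos2 = 0;
        }

        float process(float x)
        {
            // Tap 1, allpass: y = -g*x + x[n-D1] + g*y[n-D1]
            float d1 = buf1[pos1];
            float y1 = -g * x + d1;
            buf1[pos1] = x + g * y1;
            pos1 = (pos1 + 1) % buf1.length;

            // Tap 2, feedforward comb: store the original input and read it
            // back len2 samples later
            float d2 = buf2[pos2];
            buf2[pos2] = x;
            pos2 = (pos2 + 1) % buf2.length;

            // How the taps mix isn't specified above, so this sum is a guess
            return y1 + g * d2;
        }
    }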

It probably needs plenty more testing before I've sorted out all the bugs, and the best way to do that is to provide some interface to explore the synth. I might make an SFXR clone sometime soon, as I get the graphics implementation together and start making it more of an "engine".

The example is a "dry" signal followed by a "wet" (effected) signal.

I intend to have simple, 2600 or Odyssey 2 style graphics in TV-GAME. But as with the synthesizer, I decided to make something that is actually implausibly weird and overpowered, and then cut it down from there during asset creation.

So the system I have is 320x240x8, with one palette per scanline. You can store up to 256 palettes and assign palettes to scanlines by index, so you can do fast color bar effects but also typical 256-color palette rotations. I want to add horizontal scroll effects. There are sprites, and an indefinitely-sized amount of VRAM for sprite assets; all sprites are 8x8, and you can batch them into groups (maybe I will cut this). You will be able to draw a lot of sprites, enough that you could use them for tilemaps too. There are limitations here, but I'm strongly leaning towards aesthetic limits, not technical ones.
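
As a sketch of the resolve step (array names and the packed-RGBA output are illustrative, not the engine's actual code):

    // 8-bit indexed framebuffer, up to 256 stored palettes, and a per-scanline
    // table that picks which palette each line uses.
    enum W = 320;
    enum H = 240;

    uint[256][256] palettes;   // [paletteIndex][colorIndex] -> packed RGBA
    ubyte[H] scanlinePalette;  // palette index assigned to each scanline
    ubyte[W * H] pixels;       // 8-bit indexed framebuffer
    uint[W * H] output;        // resolved RGBA framebuffer

    void resolveFrame()
    {
        foreach (y; 0 .. H)
        {
            const(uint)[] pal = palettes[scanlinePalette[y]][];
            foreach (x; 0 .. W)
                output[y * W + x] = pal[pixels[y * W + x]];
        }
    }

Color bar effects are then just writes into scanlinePalette, and a palette rotation is a write into one row of palettes.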

I got in two good sessions in the past two days: cleaned up performance, fussed with the anti-aliasing methods a little more while I was at it (it's simpler now, maybe less distorting but also more obviously artifacted), got the noise working, got volume smoothing in for sounds that need smooth/fast envelopes, and added some filter controls.

After some more testing I can try adding the delay and making a sound engine on top of this.

I've been distracted with job interview-related stuff but am still building up the synth and making some design changes. I decided that because I've succeeded in making naive oscillators acceptably anti-aliased, I could also use a customizable wavetable as the oscillator engine, rather than fixed-function oscillators. The wavetable implementation (right now) has a 7-bit depth, 16 steps, and can toggle linear interpolation per step. This means exact reproductions of the naive sawtooth, square, triangle, etc. are possible, but also many more original and dynamic timbres. When I first started with the synth concept I thought that I would have a lot of fixed-function choices, but this path still keeps things simple while adding a lot of flexibility. I thought of adding waveshaping, but it's also not necessary if the wavetable itself can be customized to this degree.
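
A sketch of the wavetable read, under my own assumptions about layout and output scaling (this isn't the actual implementation):

    // 16 steps of 7-bit amplitude plus a per-step flag: ramp linearly toward
    // the next step, or hold the current value.
    struct Wavetable
    {
        ubyte[16] steps;   // 7-bit values, 0..127
        bool[16] interp;   // true: interpolate to the next step; false: hold

        // phase in [0, 1); returns roughly -1..+1
        float read(float phase) const
        {
            float fpos = phase * 16.0f;
            int i = cast(int) fpos;
            float frac = fpos - i;
            float a = steps[i & 15];
            float b = steps[(i + 1) & 15];
            float v = interp[i & 15] ? (a + (b - a) * frac) : a;
            return (v / 63.5f) - 1.0f;
        }
    }

With that layout, the naive sawtooth is 16 ramping steps with interpolation on, and the square is two held halves with it off.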

I still have to resolve how I'm doing the noise/distortion tones, finish a "real" implementation that integrates everything into a usable form for real-time, and test the other parameters to see what ranges and resolutions work well (filter modes and parameter resolution are a fairly big topic). And then add some effects, if my resolve on that holds (reverb probably no, delay probably yes).

In conclusion, it's coming along, and then I will proceed to work on the rendering of TV-GAME.

And I realized that I don't need to write my own allocator to do what I'm planning. Yet.

(Still working on synthesizer-related stuff)

I didn't end up doing graphics. Instead I figured out how the oscillators on VX-1's synthesizer, "AUD1", will behave. The problem is that I wanted to have weird, colorful oscillators that may alias a lot - dropout distortions like the Atari 8-bits, chaotic functions, etc. - and so I can't just assume the input signal is good and work with it directly.

So I worked on making a cleanup strategy for naive oscillators that makes some compromises between anti-aliasing, high frequency preservation, and processing time.

For each output sample, I do this 4x:

  1. 1.5x oversample the original function with (osc(z) + osc(z + 0.5)) * 0.5. (This is actually a tricky simplification of using a "real" FIR filter kernel, which relies on most of the terms cancelling out at the midpoint.)

  2. Run the resulting sample into my old IIR filter design, using the same 1.5x oversampling trick with (last filter input + this filter input) * 0.5. The IIR maxes out at samplerate / 2.5, which gives some additional headroom beneath the Nyquist frequency. Cutoff control of some kind will exist, possibly chained to amplitude control.

  3. Output the final low-passed sample of the IIR.
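
Put together, the loop looks roughly like this (a sketch: osc() stands for any naive oscillator, the one-pole lowpass is only a stand-in for the real IIR design, and "z + 0.5" is read here as half an oversampled step):

    import std.math : PI, exp;

    alias Osc = float function(float);

    struct AntiAliasedOsc
    {
        Osc osc;               // any naive (aliasing) oscillator
        float phase = 0.0f;
        float lastIn = 0.0f;   // previous filter input, for the 1.5x trick
        float lp = 0.0f;       // one-pole lowpass state (stand-in IIR)
        float coeff = 0.0f;

        void setup(Osc o, float cutoffHz, float sampleRate)
        {
            osc = o;
            // coefficient computed at the 4x oversampled rate
            coeff = 1.0f - exp(-2.0f * PI * cutoffHz / (sampleRate * 4.0f));
        }

        // phaseStep: oscillator phase increment per *output* sample
        float next(float phaseStep)
        {
            float outSample = 0.0f;
            foreach (i; 0 .. 4)   // 4x oversampling
            {
                phase += phaseStep * 0.25f;
                // 1.5x oversample: average the sample with its half-step midpoint
                float s = (osc(phase) + osc(phase + phaseStep * 0.125f)) * 0.5f;
                // same midpoint trick on the filter input
                float fin = (lastIn + s) * 0.5f;
                lastIn = s;
                lp += coeff * (fin - lp);   // one-pole lowpass
                outSample = lp;             // keep only the last sample
            }
            return outSample;               // decimate 4x back to 1x
        }
    }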

It still aliases; you can hear it in the upper ranges of the attached demo. But the bulk of the signal is clear enough, which is what I'm aiming for, and while heavier oversampling improves it, the 4x/1.5x/1.5x strategy is "good enough" to work in a mix, especially as the filter cutoff dips lower.



The plan as of right now is to take this design, build 16 oscillator types around it (primitive functions, various noise/distortions, some fun/interesting stuff), a reverb and delay effect, and then some simplified amplitude and filter control. A key design goal is that the "knobs" on this have low granularity everywhere - there are exactly 256 pitches available, 16 volume steps, etc. It's inspired by early-80's chip sounds, but will then add things that give it optional shimmer and polish and make it a chameleon of an instrument.
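
Purely to illustrate the granularity idea (the actual tuning and volume curves aren't decided; these numbers are made up):

    import std.math : pow;

    float[256] pitchTable;    // exactly 256 available pitches
    float[16]  volumeTable;   // exactly 16 volume steps

    void buildTables()
    {
        foreach (i; 0 .. 256)   // e.g. 32 steps per octave starting at 13.75 Hz
            pitchTable[i] = 13.75f * pow(2.0f, i / 32.0f);
        foreach (i; 0 .. 16)    // e.g. roughly 3 dB per volume step
            volumeTable[i] = i == 0 ? 0.0f
                                    : pow(10.0f, (i - 15) * 3.0f / 20.0f);
    }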


I might even add some kind of lo-fi PCM.

test.wav (10mb)

This week I reproduced a memory allocation strategy for audio buffers that I had previously made in Haxe - this time in D, making use of lower level constructs. It block-allocates buffers of variable sizes, with reference counting and some padding. It is not optimal on memory usage, but instead focuses on having all the processing data in one contiguous array for cache-friendliness purposes. The buffers may optionally describe an algorithm of "integer type and 16 floats and 16 ints as parameters." You can associate blocks into an entity and have the entity describe a list of blocks as a sequential program of "algorithm 1", "algorithm 2", etc. It's designed for very loosely defined usage - algorithms might be sharing a single buffer, copying between them, etc. The unit tests work, and I also got SDL audio output working with a sine wave test, so once I have some graphics rendering working and the buffers actually doing functional work, I will feel a little more confident about it.
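
In rough outline, the layout is something like the sketch below (names, field choices, and the padding value are simplifications, not the actual code):

    // One contiguous float array holds every buffer; a block record carries
    // the refcount, the buffer's extent, and the optional "algorithm" header
    // of one integer type plus 16 float and 16 int parameters.
    struct Block
    {
        size_t offset;     // start of this buffer in the shared array
        size_t length;     // usable samples; padding sits outside this range
        int refCount;
        int algorithm;     // algorithm type id, or -1 for plain data
        float[16] fparams;
        int[16] iparams;
    }

    // An entity describes a list of blocks run in order as a program:
    // "algorithm 1", "algorithm 2", etc.
    struct Entity
    {
        int[] blocks;      // indices into BlockPool.blocks
    }

    struct BlockPool
    {
        float[] data;      // the single contiguous array, for cache friendliness
        Block[] blocks;
        size_t used;
        enum padding = 16; // guard samples between buffers (assumed value)

        this(size_t capacity)
        {
            data = new float[capacity];
            data[] = 0.0f;
        }

        int allocate(size_t length)
        {
            Block b;
            b.offset = used;
            b.length = length;
            b.refCount = 1;
            b.algorithm = -1;
            used += length + padding;
            assert(used <= data.length, "pool exhausted");
            blocks ~= b;
            return cast(int)(blocks.length - 1);
        }

        void release(int index)
        {
            if (--blocks[index].refCount == 0)
            {
                // reclamation/compaction is out of scope for this sketch
            }
        }
    }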

This week will probably be about 2D graphics. I need a visualizer, and I'm feeling inspired to support sprites and text, at least at a basic level.

I guess I'm coding an engine at this point. This wasn't what I thought I was going to do when I started, but it actually lets me make the tooling all work together in a seamless design, which is Important To Me because it adds a lot of value.

All I really did this past week was try to get started with the D language, which, after having exhausted other shiny new options, I decided on for my native code projects:

  • Fixes the biggest issues of C++ as an applications language, but maintains a similar level of control and compatibility
  • Relatively mature - several generations of tooling and libraries; lessons have been learned
  • More than one implementation - not much "bus factor" risk at this point

This isn't actually my first time trying out D, but the previous time I was still in high school, I think, and did not really have the background to do anything useful (plus it was a much less mature environment). What I remember about that experience was bitterly arguing with my older brother about whether we should include a "z" variable on a 2D sprite because it "might" be needed.

I managed to compile a demo for the gfm framework/library collection, and made a binding for stb_voxel_render.h, but realized that I was a bit overwhelmed trying to deal with a new language and a new API at the same time, so I will put getting cubes onscreen on hold for a little while in favor of porting some of my synthesizer code.

My thoughts about the voxel mapper shifted toward unifying all my tools projects. Will it be implemented? We'll see! It should all be within an order of magnitude of possibility, though - just a matter of development time.


VX-32: fantasy console/engine, in the spirit of PICO-8, but designed more around pushing modern hardware with brute-force, easy-to-use techniques and an extremely integrated toolchain with a large number of presets. Hard limits aren't the focus, but we're aiming for 4-8 megabyte ROMs.

Desktop toolchain - we have a "ROM builder" that functions as an IDE. Open-protocol service to automatically store, share, synchronize games - accounts, registration, etc. This is not a web or mobile-focused project, so it'll use native code. Language not yet settled.

Rendering: stb_voxel_render.h as the basis, in mode 0 or mode 1. Fez-like 8x8x8 "trile" system used as the building block for maps; texturing and vheight also applied to lend additional shape and detail to trilemaps. Tools built around the trile system. Goal is something like the uniformity and low memory consumption of graphics data on the NES et al., just with additional dimensions. Detailed, smaller, static maps rather than rote emulation of big Minecraft landscapes and malleability. Voxel sprites and effects rendered as additional trile passes - rotation and alignment is freeform. Particles? Sprites? TBD
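
Concretely, the data shapes I'm imagining are something like this (purely hypothetical; stb_voxel_render's own API is not shown here):

    // A trile is an 8x8x8 block of voxels; a map is a 3D grid of trile indices
    // into a shared trile library.
    struct Trile
    {
        ubyte[8][8][8] voxels;   // material/color index per voxel, 0 = empty
    }

    struct TrileMap
    {
        int w, h, d;             // map dimensions, in triles
        ushort[] triles;         // w*h*d indices into the trile library

        ushort trileAt(int x, int y, int z) const
        {
            return triles[(z * h + y) * w + x];
        }
    }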

Ampliscape becomes the canonical sound engine: reformulation from a "game-like" tool into a more fully formed, DAW-like environment. Synthesizer redesigned around a flexible sampler system, "Amplisynth". 64 voices assignable to 16 stereo mixgroups; voice allocation is automated. 16/48kHz sample playback with panning, loop points, and release. One multi-mode VA filter per voice (the same filter I currently have in Ampliscape's synth). 8-stage envelopes with loops. Some as-yet-undetermined FX chain per mixgroup. Ampliscape acts as the toolchain to generate sequences, apply procedural effects, trigger sounds, etc.
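
Read as a data layout, that spec comes out roughly like this (field names and types are guesses on my part, not an implementation):

    struct EnvelopeStage { float level; float time; }

    struct Voice
    {
        int sampleId;               // which sample to play (16/48kHz playback)
        float pan;
        size_t loopStart, loopEnd;  // loop points
        size_t releaseStart;        // release behavior on note-off
        int filterMode;             // one multi-mode VA filter per voice
        float cutoff, resonance;
        EnvelopeStage[8] envelope;  // 8-stage envelope
        int envLoopStart, envLoopEnd;
        int mixgroup;               // 0..15, set by the automated allocator
    }

    struct Mixgroup
    {
        float gainL, gainR;
        // per-mixgroup FX chain is still undetermined
    }

    struct Amplisynth
    {
        Voice[64] voices;           // 64 voices
        Mixgroup[16] mixgroups;     // 16 stereo mix groups
    }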

Built-in ROM containing default textures, triles, and sound samples, including an ASCII font, character and environment art, and various bread-and-butter sounds and synth patches. The goal is to never feel reliant on importing data from elsewhere in order to finish a project, and to have a distinctive base aesthetic.

Scripting - TBD.

Scripting focus for actual game code - to be "virtual console"-like, it shouldn't be necessary to dig into the engine, although some method of escaping and taking control of the engine and toolchain will exist for the brave and foolish. Entity management, collision data, camera, etc. - built in but designed for scripting control.

Target for higher-level game tools - e.g. walkthrough sim maker, shmup maker, etc.

Just did some paper design this week, tried to get some interest checks going. It's going to be a "voxel level editor", probably using a mix of heightmap, tiling, and scene hierarchy techniques. Pipeline will be scriptable in some fashion to export to a variety of things.

I actually did this earlier this week, but it's ready now. Windows only. May try testing for other platforms soon.

http://triplefox.itch.io/pixelbound

I'll be able to release a Windows build within the coming days, hopefully by Thursday so there's some time to test it before Ludum Dare. I still need to do some documentation though, especially for the project file format.

I still have to finish load and save-related stuff, and I realized I wanted to support multiple images. Maybe by next week!


The drawing-and-adjusting parts of Pixelbound are basically complete, but might go through some visual tweaks later. Remaining stuff:

  1. Finish the side panel for renaming and managing different rectangles (insert, remove, order)
  2. Add an ID tag for each rectangle (a number, associated with a project palette)
  3. Project load and save
  4. Some cleanup of the initial project load dialogue


Tool Development 2015

Game tools development, etc.

weekly from 2015-04-01 to 2016-01-01