20150829

Ketogenic Diet - Working on Year 2

Part way into year two on a Ketogenic diet, breaking the diet only once in a while on business trips. The diet is mostly fat, some protein, with almost no carbs.

Initially established in the 1920s as a way to control seizures for people with epilepsy, the Ketogenic diet is being successfully used as a metabolic treatment for cancer by a few individuals, but is largely being ignored by medical professionals. The diet works by shifting the body's metabolism from glucose (sourced from carbs or converted from excess protein) to ketones produced in the liver (sourced from fat and oil). The diet has natural anti-inflammatory properties. The theory of how a Ketogenic diet fights cancer revolves around the idea that cancer is mostly a mitochondrial metabolic disease. Specifically, cancer cells tend to have damaged mitochondria, which switch to a more primitive glucose-fueled fermentation as their primary energy-generation process. Starving cancer cells of glucose places them in extreme metabolic stress, allowing the body to fight back. One of the primary ways to track cancer is by looking at the process of tumor angiogenesis via periodic MRIs with contrast: effectively watching over time as the cancer causes the body to develop an ever stronger network of blood vessels to feed the cancer with glucose. Successful treatments of cancer can reverse this process. I suspect ultimately that everyone has cancer, even if only at some undetectable amount. The question is whether the body's balance shifts to a state which enables the cancer to grow, or a state which causes the cancer to die. Cancer becomes terminal when there is no longer a way to shift the balance back.

The Ketogenic diet for me is a lifestyle choice, not made out of medical necessity. My personal tastes tend to really align with the diet, and it is a great way to stay in shape, more so when you have a career sitting at a desk typing away on keyboards. Counter to how the media vilifies fat as the source of the nation's obesity problem, it is near impossible to maintain body fat on a diet which involves mostly eating fat: the body is in a constant state of fat burning instead of fat storage.

Looking back, it was relatively hard to get started. America's entire food culture and supply chain is optimized for the delivery of carbs, leaving a demilitarized zone filled with land mines for the oil-fueled consumer. Just finding things which are within the parameters required for the diet can be quite a challenge. After a while, planning every meal, weighing all ingredients, and measuring ketone or blood glucose levels is replaced with driving by feel alone. The transition between the carb-burning and ketone-burning body states goes through a standard process: horrible sugar-withdrawal symptoms, bouts of fatigue and brain fog, then eventually a return to feeling normal, but unaffected by the standard cravings carb eaters have. The first transition takes weeks; however, after being ketone-burning for this long, the transition now only takes me a few days.

Over time the diet becomes as enjoyable as the standard high-carb diet, and even more so in many regards, because of the ability to easily take in 70% fat at a given meal (like bacon-wrapped sour cream). Unlike sugar, there is no crash afterwards, and the body provides some rather strong signals to stop eating before you overdo it, instead of telling you to keep going as is the standard practice with sugars. Here is an example of the kind of food my wife and I eat: butter on low heat, mixed with spices and garlic, and boiled shrimp. Consumed head, shells and all, with sour cream straight on the side to bring up the fat content.

Atari Shock Reloaded?

Maybe this post should just be called "Indie Shock". Interesting graph below, posted on Twitter, of the number of Steam game releases over time. Saturating, isn't it?

[Graph: Steam game releases over time]

Thoughts From a Personal Perspective as a Consumer
Engines like Unity and Unreal make it much easier to produce games, but the games tend to be more similar, staying within the limitations imposed by these mass-market engines. The same effect happens as independent engine tech all falls into the same local minimum, or as developers limit risk by staying in the confines of well-walked genres. This makes it harder for a consumer to differentiate between titles. Choice in a sea of noise is random. It is not so much the content which shapes the purchase decision in that case, but rather how the consumer gets directed by marketing.

As it becomes harder to choose, and as more choices fail to satisfy, the barrier to purchase increases, and even the falling price cannot compensate. The price of free is actually quite high: the opportunity cost of doing something more compelling with one's time.

"Nobody Cares" aka the Excuse For Being Mediocre
Why bother investing time to achieve greatness? Proponents of this line of thinking often justify it by claiming that the average consumer cannot tell the difference between low and high quality. For a producer, this is effectively a self-selecting choice to continue to swim in that sea of noise. Some forms of greatness may not be perceived at the conscious level, may not be something a consumer can articulate in words, but may instead manifest only in feel, and yet have a profound effect.

Outliers
Knowledge of excellence in some aspect which affects only a fraction of the market, say awesome multi-GPU support, establishes a hint that the producer cares about the product at a level beyond serving me a microwaved pre-constructed hamburger. It is very hard to maintain the employment of the creative and driven individuals who produce top content without allowing them to strive for greatness, even sometimes at the compromise of maximum profitability.

As a consumer in a sea of noise, I select for the expression of producers looking to be the best that is possible: those who, given the limitations of architecture and time, choose paths which compromise in a way that allows a unique realization of the ultimate form of their art.

20150818

Quick ACES Thoughts

Appears that the RRT global de-saturate step applied in AP1 drops to a gamut smaller than Rec2020. This seems OK when targeting Rec709/sRGB, but I'm not sure it is future proof in the context of Rec2020. Seems like the reference ACES ODT for Rec709 at 48 nits ends up with gamut clipping when inputs to the RRT cover the full positive part of AP1 space. Those working with sRGB/Rec709 primaries in the rendering pipeline might not have issues here, depending on how much saturation is added during grading before the RRT. Guessing some people would rather be able to go nuts anywhere in the human perceptual space and have it smoothly map to the output display space?
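
For illustration, a minimal C sketch (my addition, using the commonly published approximate AP1-to-linear-Rec709 matrix, values rounded): a saturated but fully positive AP1 color lands outside Rec709, where a negative output component means gamut clipping.

  // Detect Rec709 gamut clipping for an AP1 (ACEScg) color.
  // Matrix values are approximate, rounded from the published transform.
  #include <stdio.h>

  static void Ap1ToRec709(const float in[3], float out[3]) {
    static const float m[3][3] = {
      { 1.70505f, -0.62179f, -0.08326f},
      {-0.13026f,  1.14080f, -0.01055f},
      {-0.02400f, -0.12897f,  1.15297f}};
    for (int i = 0; i < 3; i++)
      out[i] = m[i][0]*in[0] + m[i][1]*in[1] + m[i][2]*in[2];
  }

  int main(void) {
    // Pure AP1 green: positive in AP1, outside the Rec709 gamut.
    float ap1[3] = {0.0f, 1.0f, 0.0f}, rec709[3];
    Ap1ToRec709(ap1, rec709);
    int clipped = rec709[0] < 0.0f || rec709[1] < 0.0f || rec709[2] < 0.0f;
    printf("%f %f %f clipped=%d\n", rec709[0], rec709[1], rec709[2], clipped);
    return 0;
  }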

20150814

The Written Word

Growing older, I find that games, movies, and TV are all limiting forms of entertainment, and that by far the best form of story-driven consumable is the book. Right now I'm halfway through On the Steel Breeze by Alastair Reynolds, taking a break to reflect. Something was lost over the years as digital entertainment evolved from the soup of interactive text adventures. I certainly enjoy the visual representation of a good story, but I enjoy more the freedom to explore stories which could never gain the support necessary for a non-literary translation. Early in gaming there was an interesting balance forced by the limitations of the machine, where the written word took the place of electronically "physically" realizing everything in the game. It would be great once in a while to trade the modern game single-player storyline, played out in "cut scenes", for a story of the caliber of a great novel, represented instead in "cut pages" of text, shifting the focus of development and polish back into the game itself.

20150812

Cloudhead Games : Blink Locomotion for VR

Blink locomotion for VR is quite a cool idea ... but there is an opportunity beyond removing VR sickness: graphics. It brings back the feeling of classic adventure games: fixed spaces in which the player interacts, with instant connectivity between the spaces. The opportunity for graphics is to pre-compute the spaces to extremely high fidelity, effectively pre-solving the visibility and light transport, with a secondary system which composites the dynamic 3D elements into the scene...
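
As a rough sketch of the compositing idea (my assumption of one possible realization, not Cloudhead's implementation): store the pre-computed space as a pre-rendered color+depth image, and merge run-time dynamic elements per pixel by depth test.

  #include <stdint.h>

  typedef struct { uint32_t color; float depth; } Pixel;

  // Merge the dynamic layer over the pre-computed background, keeping the
  // nearer surface at each pixel.
  static void Composite(Pixel* out, const Pixel* baked, const Pixel* dynamic,
                        int count) {
    for (int i = 0; i < count; i++)
      out[i] = (dynamic[i].depth < baked[i].depth) ? dynamic[i] : baked[i];
  }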

20150810

1536-5 : Keys

Evening 5 on 1536. Wrote a mini PS/2 keyboard driver based on prior work. Ran out of time for testing; got distracted by SIGGRAPH slides. Only supporting 64 keys (bit array in a register), good enough to run arcade controllers which alias as keyboards. Only supporting driver-side key release on {shift, control, alt}, allowing the application to clear bits for release of other keys. Had an interesting bug today: forgot to implement the "MOV REG,REG" opcode; surprised I got this far in 1536 without register-to-register move. Manually keeping 16-byte groupings for instructions has some interesting side effects on coding style...
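
A minimal C sketch of the scheme described above (my reconstruction for illustration; the actual 1536 driver is written in its own forth-like language), assuming PS/2 scan code set 2, where 0xF0 prefixes a key release:

  #include <stdint.h>

  static uint64_t keyBits;      // 64-key state, one bit per key.
  static int      breakPrefix;  // Saw 0xF0, next byte is a release.

  // Hypothetical placeholder mapping from scan code to 6-bit key index.
  static int MapScanCode(uint8_t code) { return code & 0x3F; }

  // Set 2 make codes for the modifier keys the driver auto-releases.
  static int IsModifier(uint8_t code) {
    return code == 0x12 || code == 0x59 ||  // left/right shift
           code == 0x14 ||                  // control
           code == 0x11;                    // alt
  }

  void OnScanCode(uint8_t code) {
    if (code == 0xF0) { breakPrefix = 1; return; }
    int release = breakPrefix; breakPrefix = 0;
    int key = MapScanCode(code);
    if (key < 0) return;
    if (release) {
      // Only modifiers get driver-side release; the application clears
      // the bits for all other keys itself.
      if (IsModifier(code)) keyBits &= ~(1ull << key);
    } else {
      keyBits |= 1ull << key;
    }
  }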

SIGGRAPH : Ready At Dawn

Ready at Dawn is starting to post SIGGRAPH content: readyatdawn.com/ready-at-dawn-siggraph

GL and Vulkan at SIGGRAPH 2015

The biggest news is that Google is going to ship Vulkan on Android. Vulkan is set to become the best option for cross-platform portable lower-level graphics development: Android, Linux, SteamOS, Windows 7/8/10/etc. Vulkan has some great advantages: (a.) Vulkan is not locked to a given OS version, (b.) Vulkan has an extension system both in the API and shader language which enables hardware vendors to expose features of the hardware and enables the API to continue to rapidly improve, (c.) Vulkan as an open standard promotes great 3rd party support (see what people are already doing with SPIR-V)...

GL extension specs were released for a lot of great new features, including ARB_shader_ballot. This is a great step forward in the process of getting support for basic ISA functionality which has been shipping in hardware for the past 3 years.

20150809

1536-4 : Coloring

Night 4 on 1536. Brought up most of the "x56-40" (x86-64 in hex) assembler now. Also have the majority of the forth-like words needed to assemble self-documenting constants {add,mul,neg,not,and,or,xor,...}.

Started on the editor. Just enough of a quick prototype to render the text view in the editor (sans cursor for now). All screens in this post are captured from the editor running in an x86-64 emulator. Keeping the fixed 64-character lines makes everything very simple. Syntax highlighting was carefully designed to only need one line of context: just a simple backward sweep to color, then a forward sweep to correct the color for comments (the \ marks the rest of the line as a comment). Adjusted the font: {_,-,=} all now extend the full font cell width so they can double as lines. Adjusted the colors closer to what I like for syntax highlighting. Still experimenting with how to comment and arrange source.
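
A minimal C sketch of the two-sweep, one-line coloring (my guess at the details; the real 1536 rules differ):

  enum { COLOR_TEXT, COLOR_NUMBER, COLOR_COMMENT };

  // Hypothetical per-character classifier for the backward sweep.
  static int Classify(char c) {
    return (c >= '0' && c <= '9') ? COLOR_NUMBER : COLOR_TEXT;
  }

  void ColorLine(const char line[64], unsigned char color[64]) {
    // Backward sweep: base colors from token class.
    for (int i = 63; i >= 0; i--) color[i] = Classify(line[i]);
    // Forward sweep: '\' marks the rest of the line as a comment.
    int comment = 0;
    for (int i = 0; i < 64; i++) {
      if (line[i] == '\\') comment = 1;
      if (comment) color[i] = COLOR_COMMENT;
    }
  }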

Have 16 characters to the right of the source window to use for real-time debug data, like viewing values of registers, memory, etc. Thinking through the details in the background. Next step is to bring up the non-USB throw-away keyboard driver, then get the editor functional.

Bugs
Still finding the no-errors, no-tools, know-everything path easy to work with. This time I lost some time to an opcode assembly bug. A full class of opcodes was broken, something never validated from last time: I just forgot to make a RIP-relative offset RIP-relative for non-branch instructions. Everything else worked out of the box with no human errors. When the mind can reason about the entire system, and the edit/execute loop is near instant, bugs are normally an instant fix. Quite satisfying to work this way.
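
For reference, the fix amounts to this (my illustration): an x86-64 RIP-relative disp32 is measured from the address of the next instruction, not from the instruction itself.

  #include <stdint.h>

  static int32_t RipDisp32(uint64_t instrAddr, uint64_t instrLen,
                           uint64_t targetAddr) {
    // RIP points at the next instruction when the displacement is applied.
    return (int32_t)(targetAddr - (instrAddr + instrLen));
  }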

20150802

Demo Tubes: Parnassum & Monolith

[Demo videos: Parnassum & Monolith]

Thoughts on the Evolution of Processor Design

Feels like the fundamental limiter in the evolution of processor design is the {load,alu,store} paradigm: the separation of memory and ALU at all scales. A CPU is effectively like having billions of people, each with a mailbox to store data, all routing the data to just one single person (out of billions) with a calculator doing the computation. As CPUs have evolved, there has only been a tiny increase in the number of people with calculators. Then the GPU enters the timeline, providing a substantial increase in the number of people with calculators, but this increase is still relatively tiny with respect to the number of people routing data to and from the mailboxes. I'm wondering if perhaps all the people should just have calculators. Looking at some numbers,

Chip Capacity in Flop per Clock per Transistor
Using numbers from Wikipedia for Fury X,

8601 Gflop/s
8900 Mtransistors
1050 MHz

Capacity for upwards of 8000 flops each clock, but with around 1 million transistors per flop.
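
Working the numbers: 8601e9 flop/s / 1050e6 clocks/s ~= 8192 flops per clock, and 8900e6 transistors / 8192 flops per clock ~= 1.09 million transistors per flop.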

Science Fiction Version of the Suggested Paradigm Shift
A completely science-fiction version of the suggested paradigm shift might be a chip with 256 MB of memory, divided into 32 million 64-bit cells, with each cell doing one 64-bit uber-operation every 64 clocks (a bit per clock), clocked at a relatively low rate like 500 MHz: providing something like 250,000,000,000,000 uber-ops per second. The board would be composed of 3D stacks of these chips connected by TSVs, with stacks connected by interposers (like HBM). A board might have something like 16 stacks, providing 4,000,000,000,000,000 uber-ops per second. A local parallel just-in-time compile step configures cells to work around bad cells, so yield problems go away. The mindset used to program the machine is quite different. Data constantly flows around the chip as it is filtered by computation, the organization of data constantly changing to adapt to locality of reference. Programs reconfigure parts of the chip at run-time to solve problems. Reconfiguring a cell is basically adjusting which neighborhood connections are inputs to the uber-op, and the properties of the uber-op.
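
A toy C model of such a cell fabric (purely illustrative; every detail here is invented): each 64-bit cell applies a configurable uber-op to selected neighbor outputs each step, and reconfiguring a cell just rewrites its op and input selections.

  #include <stdint.h>
  #define CELLS (1<<16)  // Toy size; the post imagines 32M cells (256 MB).

  typedef struct {
    uint8_t op;          // Which uber-op the cell performs.
    uint32_t inA, inB;   // Indices of the neighbor cells feeding it.
  } CellConfig;

  static uint64_t state[CELLS];
  static CellConfig config[CELLS];

  static uint64_t UberOp(uint8_t op, uint64_t a, uint64_t b) {
    switch (op) {
      case 0: return a + b;
      case 1: return a & b;
      case 2: return a ^ b;
      default: return a;   // Pass-through, e.g., routing around a bad cell.
    }
  }

  // One global step: every cell consumes neighbor outputs and produces a
  // new value (double-buffered so all cells update in parallel).
  void Step(uint64_t* next) {
    for (uint32_t i = 0; i < CELLS; i++)
      next[i] = UberOp(config[i].op, state[config[i].inA],
                       state[config[i].inB]);
  }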