
Re: [oc] Beyond Transmeta...



> I don't know if I clearly understand your network-centric program.
> But suppose that one action occurs - a mouse click or anything. Then
> you have to execute, let's say, 100 sequential instructions with ILP = 3.
> Say your computer would take 1000 cycles to do it, and an average RISC
> 50. Due to simplicity you could have a 50% higher clock speed, but
> it is still a lot slower (assuming communication cost is zero).

But one thing to remember is that the network is persistent. Assuming you 
have a large amount of memory resources, you can have complex branching so 
that the mouse button, represented by one bit, causes a chain reaction of 
events when its value changes; think of the network and imagine the bit 
change propagating like a lightning strike through it. If self-modification 
is done well enough, then when you move the mouse over a window, the pathway 
for the mouse-button bit is changed: when the x or y coordinate bits change, 
they too cause a chain reaction, which alters the network of the mouse-button 
bit. Another way the network can be arranged to handle this is that the bit 
change itself causes a chain reaction of comparisons of the mouse x and y 
against the window rectangles. It would be up to the network to arrange 
itself optimally for the system. The other way the network could arrange 
itself is like CISC or RISC, where the mouse action causes the network to 
fake an interrupt, which causes a sub-network representing the processor to 
move instructions down a pipeline of bits so that it can process the 
higher-level instructions.
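The change-propagation idea above can be sketched in a few lines of Python (illustrative only - the node names, the window rectangle, and the update rule are my assumptions, not something from your design):

```python
# Minimal sketch of a persistent bit network where only *changes* propagate,
# like the "lightning strike" described above. All names are illustrative.

class Bit:
    def __init__(self, name, value=0):
        self.name = name
        self.value = value
        self.listeners = []   # downstream bits recomputed when we change

    def connect(self, update_fn, target):
        """When this bit changes, recompute `target` via `update_fn`."""
        self.listeners.append((update_fn, target))

    def set(self, value):
        if value == self.value:
            return            # no change -> no chain reaction, no work done
        self.value = value
        for update_fn, target in self.listeners:
            target.set(update_fn())   # propagate only along affected paths

# Example: a bit that tracks whether the mouse is inside a window rectangle.
mouse_x = Bit("mouse_x")      # simplified: whole coordinate as one node
mouse_y = Bit("mouse_y")
inside = Bit("inside_window")

WIN = (10, 10, 50, 40)        # x, y, width, height (assumed window)

def recompute_inside():
    x, y = mouse_x.value, mouse_y.value
    return int(WIN[0] <= x < WIN[0] + WIN[2] and
               WIN[1] <= y < WIN[1] + WIN[3])

mouse_x.connect(recompute_inside, inside)
mouse_y.connect(recompute_inside, inside)

mouse_x.set(20)
mouse_y.set(20)
print(inside.value)   # 1 - the comparison ran only because inputs changed
mouse_y.set(20)       # unchanged value: nothing is recomputed at all
```

Note that a quiescent input triggers no work anywhere downstream, which is the point: updates follow the wavefront of changed bits rather than a central clocked loop.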

This is not entirely different from normal systems. For example, Windows 
more than likely checks whether the mouse has moved into another window 
whenever the mouse moves, so that when a mouse click occurs it can quickly 
send the click message to that window. A clearer way to think about this is 
that when you move the mouse, you change some of the x and y coordinate bits 
(not necessarily all of them); the bits that do change affect only the bits 
connected to them, and only if those connected bits change do they in turn 
change other connected bits. This way you can scatter the usage of the 1-bit 
processors among many different operations happening simultaneously, and 
only changes cause updates. Put differently, in a serial processor program 
the mouse x and y coordinates have to be compared to a rectangle each time 
the mouse moves; other things running in the background are interrupted by 
this action, and the comparisons waste the CPU as a resource by tying it up 
to recompare all the bits, as opposed to tying up only enough resources to 
compare the changed bits. This is close to what is occurring, but 
performance is still an issue, because the independence that is achieved has 
more requirements, whereas in a normal CPU there is less bit independence 
and so we have to deal with things in groups.
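To put a rough number on the "recompare all the bits" point, here is a back-of-the-envelope sketch (my assumptions: 16-bit coordinates, a short sequence of one-unit mouse moves); it counts how many coordinate bits a serial recomparison touches versus how many actually flip:

```python
# Illustrative cost comparison. A serial CPU recompares every coordinate
# bit on each mouse move; the bit network only touches bits that flipped.

BITS = 16  # assumed coordinate width

def changed_bits(old, new):
    """Count how many bit positions differ between two coordinate values."""
    return bin(old ^ new).count("1")

# Assumed sequence of (old_x, new_x) values for four small mouse moves.
moves = [(100, 101), (101, 102), (102, 103), (103, 104)]

serial_cost = len(moves) * BITS                            # all bits, every move
network_cost = sum(changed_bits(o, n) for o, n in moves)   # only flipped bits

print(serial_cost, network_cost)   # 64 8
```

The gap grows with coordinate width, since small movements flip only the low-order bits no matter how wide the coordinate is.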

> That seems good. But I am still worried about latency and speed.

It should be a major concern; that is why implementation is important. There 
may be some technology being discovered, or about to be discovered, that 
could make a huge difference and alter the way things are done, but today 
I'm not sure it's possible to build a competitive product. It may well be 
better to keep these ideas in the back of our minds, so that we can take 
advantage of, or at least know, the advantages of such a system. I think 
exchanging ideas and concerns is very important: it marks out the challenges 
ahead and the benefits of such an architecture, and helps us imagine what we 
could do with it.

Leyland Needham