more writing chapter 5.1
here are all the bits and pieces assembled for the conversation with kurt and nathan about quantum physics and quantum computing, some of it reworded to condense everything into idea points, and some of it as notes, still. it’s kind of in proper order, and more of it will be thrown out as i go along. but i thought i would go back and start working out the chapter now, instead of plodding along sorting research notes.
classical computers shaved decisions down to black or white, yes or no, one or zero. a quantum computer complexifies this process: a continuous complex variable, not just one or zero but one AND zero. superposition of states, a much greater range of answers, completely different kinds of questions. the possibilities open out in front of me. but don’t ask me for details. i’m bullshitting here.
can be so lightfingered as to see test results without running any tests. (The power of quantum computing, in an algorithmic sense, results from calculating with superpositions of states; all the states in the superposition are transformed simultaneously (quantum parallelism) and the effect increases exponentially with the dimension of the state space. The challenge in quantum algorithm design is to make measurements which enable this parallelism to be exploited; in general this is very difficult.)
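the superposition idea can be faked on a classical machine, just to watch the arithmetic. a toy sketch in python, entirely my own illustration — the function names and the whole setup are made up, this is nobody’s real simulator:

```python
import math

# a qubit as a pair of complex amplitudes (alpha, beta) over |0> and |1>;
# measurement probabilities are the squared magnitudes and must sum to 1
def probabilities(alpha, beta):
    return abs(alpha) ** 2, abs(beta) ** 2

def hadamard(alpha, beta):
    # the gate that turns a definite |0> into "one AND zero" -- an equal superposition
    h = 1 / math.sqrt(2)
    return h * (alpha + beta), h * (alpha - beta)

alpha, beta = hadamard(1, 0)        # start in a definite |0>
p0, p1 = probabilities(alpha, beta)
print(p0, p1)                       # ~0.5 each: one AND zero until you look
```

two amplitudes on a laptop is the whole point of why this is a fake: a real quantum register of n qubits carries 2^n amplitudes at once, which is the parallelism in the note above.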
these superpositions have to be in all places at once, and this is described by a possibility wave function. when you measure it you’re modulating it with its conjugate wave, which crystallizes the possibility into a probability, and so it enters reality. because until you observe it, it’s not real. it’s a figment of your mind.
but we actually see what we expect to see, and by that i don’t mean denying a reality in favor of our own prejudices, but that the results arrange themselves to fit our test parameters. even if we decide what those parameters are after the test is run. so you ‘force’ the qubit to ‘decide’ what state to ‘have chosen’ before you postselect what to test for, and it ‘obeys’. you can’t see the wave function but it can see you. why would it care enough to alter its behavior? it’s not a dog. uncertainty is built in, and the quantum computer uses it as a feature instead of a flaw.
past generations thought there was an independent world out there that had existed forever and followed its own rules that had nothing to do with them.
We’re living under these intertwined illusions about space and time. like that it’s smooth and continuous, when it’s all quantized at quantum scales. nothing like smooth or continuous. empty space isn’t empty. But the atoms in your finger never actually touch another solid thing. It’s all interaction of electrical fields that repel each other a long way from the actual object.
space and time mutate. distance between objects mutates, so there’s no such thing as a fixed measurement by which everything else can be measured. fixed constants are different in other parts of the universe. the idea of separation between things is also wrong, because quantum entanglement links every particle to every other particle.
The past, present, and future are all there at once, but your consciousness tends to move only one way along the fourth-dimensional timeline. Because you’re conditioned to. But it’s only a convention, according to quantum physics. Who’s doing the conditioning? That’s a different question. You might as well ask who’s we? That too. so what is space? good question. Is the glass half empty, half full, or imaginary? space has no meaning at all for entangled particles. maybe it’s empty. maybe it’s not empty – maybe it’s full of energy. maybe it’s all in your mind. you’re the observer.
unobserved = every possible state, observed = possibility waves squared = probability curve = outcome probability = observed reality. conjugate waves. atemporality. illusion of causality reflection of possibility waves under surface.
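the ‘possibility waves squared’ step is the only part of all this that’s plain arithmetic. a one-liner sketch, my own illustration with a made-up amplitude:

```python
# "possibility waves squared = probability curve": modulating a possibility
# wave with its conjugate wave, in this picture, is just psi * conj(psi),
# i.e. |psi|^2 -- a real, nonnegative number that can enter reality
psi = complex(0.6, 0.8)              # a made-up possibility amplitude
prob = (psi * psi.conjugate()).real  # 0.6^2 + 0.8^2
print(prob)  # 1.0
```

the complex amplitude can be negative or imaginary — that’s what lets possibility waves cancel each other out — but the squared result is always real and nonnegative, which is why the probability side looks so much tamer than the possibility side.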
why do we need to be explaining this? how does this affect quantum computer? because there’s nothing mystical about conjugate waves except among the woo physicists, the tinfoil hat phds. then why mention it? because kurt is very interested in woo physics, and feels he’s about to make a breakthru, but kurt’s really another flake who doesn’t know what he’s doing, else he’d be making the big bucks in a government funded lab, and his tremendous, earthshaking revolution in quantum computing was a complete accident, a product of a lost weekend of fabrication and programming.
probability is the square of possibility. how does squaring occur, where does conjugate wave come from? wave from future comes back, modulates possibility wave, creates present. offer/echo computer/peripheral transaction/handshake. consciousness problem. avoid it and work on easy problems, suggest consciousness beyond realm of science. deny, not verifiable externally, same thing as inanimate observer. fudging like who does squaring. “mind squares possibility-waves to get probability-curves / outcome probabilities, which produce probabilistic effects in the real world.”
i’m not sure where this happens, is it when the wave from the future goes all the way back to the original event and squares there, or is it where the event wave meets the echo wave, wherever along that path it might be. maybe it happens atemporally, instantaneously all along the path, maybe it happens only where it is measured, because that’s when the wavefunction collapses. but the modulation, is that where the possibility waves add and subtract from each other? mostly they cancel each other out, tho, don’t they? the event wave and the conjugate waves are possibility waves, and they add in the middle but square in the end, so we’re creating probability curves by squaring and then adding the modulations of the possibility waves to that?
memories not stored in brain like storage box but in time field like cloud. memories not in past but produced in present, interpreted. bergson, sheldrake. brain tunes into future and resonates with past, time not one way, timeless state past present future all real be here now.
brain interprets sensory input, only perceptions, dependent on neural pathway, filtered thru consciousness, not external reality. nothing exists outside consciousness.
what’s this got to do with quantum computers? nothing. sort of.
it’s all mostly autonomous, nerve reactions, conscious minds are the last to know, even after the body’s in motion. libet. like the heart, the brain keeps going, runs everything below consciousness, but consciousness sits on top second-guessing, justifying, moralizing.
passage of time comes from consciousness, from noticing. self-consciousness, illusion of personal free will, sign of neurotic guilty need to control and be right or at least not fuck up. habitual examining of ongoing brain events, inner narration, interpreting and echoing and agendas and programming. thinking the mind is in charge of the body.
lucid dreaming: used to think insular cortex, anterior cingulate cortex and medial prefrontal cortex, but really a network of paths interacting thru many regions, brainstem, thalamus, posteromedial cortices, cerebral cortex (dorsolateral prefrontal cortex, frontopolar regions), precuneus.
at the moment you lose consciousness under propofol the brain starts a 1hz cycle overall, while individual neurons cycle but don’t connect with the rest of the brain. neural correlate of consciousness – neural circuits in the thalamus and cerebral cortex involved in coherent 40hz oscillations that bind conscious experience. resonance.
consciousness emerges out of computer-like structure of brain, with names like reductionism, physicalism, materialism, computationalism, functionalism.
problem with warm wetware is thermal noise, decoherence. perfect isolation of quantum from environment, with quantum error correction codes. then how to communicate with outside, plug or wireless. quantum states in microtubules in brain. microtubules inside neurons as self-organizing computers, macroscopic quantum phenomena are connected to brain’s neural activity. coherent bose-einstein condensates occur in neural proteins enabling unitary binding, collapsing wavefunction in pre-synaptic axon terminals. the tubulin protein is a natural qubit, switches between two states, governed by hydrophobic pocket electron pairs coupled by London forces. If the hydrophobic pocket electron pair is superposed, then the protein conformation is also superposed and exists in both states simultaneously. isolated and configured, protein qubits = quantum computer. cylindrical lattice automata and alternating phases of isolation and communication. massive parallelism and specific microtubule lattice geometry (e.g. helical patterns following the Fibonacci series) may also facilitate quantum error correction (these codes deal not only with noise on stored quantum information, but also with faulty quantum gates, faulty quantum preparation, and faulty measurements. Peter Shor discovered the first such quantum error correcting code, storing the information of one qubit onto a highly entangled state of nine qubits.)
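shor’s nine-qubit code is way past a toy sketch, but its classical ancestor — the three-bit repetition code — shows the redundancy idea in a few lines. my own illustration, not hameroff’s microtubules or shor’s actual scheme:

```python
# the repetition-code idea behind error correction: store one bit as three
# copies, and majority-vote to undo any single bit flip caused by noise
def encode(bit):
    return [bit, bit, bit]

def correct(codeword):
    return 1 if sum(codeword) >= 2 else 0

noisy = encode(1)
noisy[0] ^= 1          # a single "error" flips one copy
print(correct(noisy))  # 1 -- the flip is voted away
```

the quantum version can’t just copy (no-cloning) or just look (looking collapses the state), which is exactly why shor needed nine entangled qubits and clever indirect measurements instead of a straight majority vote.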
marshall, penrose, bohm, hiley, jibu, yasue, hameroff, stapp, beck, eccles.
to prevent decoherence, you need to isolate the quantum computer from stronger outside forces that entangle with it and otherwise bleed off or interfere with or overpower the computer. just an array of superposed quantum bits, isolated by cooling. the cooling is to shut down everything that can interfere with quantum computation, that is, anything that can observe it and make it all collapse, because that’s what you do when you run a calculation: you make it collapse by observing it. so to keep out observation you have to cool the environment so nothing else moves. because near absolute zero, everything but quantum movement stops. quantum movement continues even at absolute zero, or the ground state. ground because all movement ceases at absolute zero. except quantum. and why is this? because it continues to go between states even at a place where there’s no energy. why is there no movement at absolute zero? 300 kelvin room temperature, 1 millikelvin, az ~-460F. quantum objects can be in two places at the same time, can teleport from one place to another (quantum tunnelling same thing?), are always moving, even in their ground state at absolute zero (what’s the other name for it?) quantum oscillator. only seen in individual atoms (until when?), because the surrounding area quickly couples with the quantum object and damps its energy out, decoherence.
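the temperatures in these notes, sanity-checked in python. just the standard conversion formulas, nothing quantum:

```python
# absolute zero is 0 K = -273.15 C; celsius to fahrenheit is F = C * 9/5 + 32
def k_to_c(k):
    return k - 273.15

def c_to_f(c):
    return c * 9 / 5 + 32

print(c_to_f(k_to_c(0)))   # -459.67 -- the "az ~-460F" above
print(k_to_c(300))         # 26.85 -- room temperature, roughly
print(k_to_c(0.001))       # -273.149 -- a millikelvin, a whisker above az
```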
cryostat. cooling is critical still. don’t yet have room-temperature quantum. going to work around it. need to make a cryostat around it, less than 1 degree above az (-273.15°c). near az the oscillator is in its quantum ground state. can’t do it on a larger scale because of decoherence, when entanglement occurs with the particle’s surroundings and damps out the quantum properties of the particle. using laser light to slow the motion of the particle inside cools it very close to the ground state, slowing it down x100. light and oscillation entangle so that light transforms into vibration and back again, but we outpace decoherence and can control quantum motion in an object. a cryostat is a vessel like a vacuum or dewar flask, with a cryogenic liquid helium bath. inside the cryostat you can get to less than 1 degree above az, then slow motion with the laser and cool even more. the light pulse changes into vibration and back fast enough to outpace decoherence. at the zero point the quantum vibration can’t emit energy, it can only absorb it, raising thermal vibrations. zero point energy is absorption, different from ordinary thermal vibrations.
The liquid helium bath is designed to keep the superconducting magnet’s bobbin of superconductive wire in its superconductive state. In this state the wire has no electrical resistance and very large currents are maintained with a low power input. To maintain superconductivity, the bobbin must be kept below its transition temperature by being immersed in the liquid helium. If, for any reason, the wire becomes resistive, i.e. loses superconductivity, a condition known as a “quench”, the liquid helium evaporates, instantly raising pressure within the vessel. A burst disk, usually made of carbon, is placed within the chimney or vent pipe so that during a pressure excursion, the gaseous helium can be safely vented out of the MRI suite. Typically cryostats are manufactured with two vessels, one inside the other. The outer vessel is evacuated with the vacuum acting as a thermal insulator. The inner vessel contains the cryogen and is supported within the outer vessel by structures made from low-conductivity materials. An intermediate shield between the outer and inner vessels intercepts the heat radiated from the outer vessel. This heat is removed by a cryocooler. Older helium cryostats used a liquid nitrogen vessel as this radiation shield and had the liquid helium in an inner, third, vessel. Nowadays few units using multiple cryogens are made with the trend being towards ‘cryogen-free’ cryostats in which all heat loads are removed by cryocoolers.
dwave. rhenium or niobium on a semiconductor surface, cooling the system to near absolute zero so that it exhibits quantum behavior. As the Times reports, the method relies on standard microelectronics manufacturing tech, which could make quantum computers easier and cheaper to make. there are several competing methods for making the qubits, including laser-entangled ions, LED-powered entangled photons, and more.
(until) epfl photonics and quantum measurements lab used light to control a large object at the quantum level. see w/naked eye, mechanical vibrations coupled with quantum systems – electric currents, light – and translate quantum information into light signals. you can use lasers to control individual atoms and ions, molecules, atomic gases. you can parametrically couple optical and mechanical movements. if the coherent coupling rate exceeds the mechanical and optical decoherence rate, then you can transfer quantum states from the optical field to the mechanical oscillator and back again a billion times a second. you can control the mechanical oscillator using a wide range of quantum optical techniques, but so far it’s only worked using microwave fields at millikelvin temperatures. experiments have failed because of mechanical decoherence rates as well as optical dissipation. but these guys did it using cold photon baths, and slowed the mechanical oscillator. make decoherence-free transportation thru optical fibers, mechanical oscillators become quantum transducers, or can use in microwave/optical quantum links.
what do you need to build a room temperature quantum computer? isolation, a barrier that either absorbs all energy or better yet reflects it. once you’ve built that you’ve got a black box and can use magic to build the kernel. which we’re going to use anyway because we don’t know what the fuck we’re talking about here. so a perfectly reflective surface, nubbed all over like the venus of willendorf or a corn cob, shaped like a teardrop or kernel because of the special properties of these shapes. where am i going to find information on this? but what to use as a shell? has to be perfectly reflective so nothing gets thru, shielded, magnetic rather than mirror, as well as mirror because not absorptive means reflective, plus radiates heat outward. what gets thru? nothing – which means no communication. how to do that if nothing gets thru. move. a tunnelling information channel sends and receives thru the shell, oscillating something on a chip outside the kernel linked to everything else. it’s the core. diff between core and kernel? like cpu and ram diff. core processor right term?
Benioff, 1982; Feynman, 1986; Deutsch, 1985; Deutsch and Jozsa, 1992; Shor, 1994; Bennett, 1995; Barenco, 1995. erik lucero ucsb, andrew cleland, horst stormer italian quantum
von neumann architecture since the 40s – cpu and ram, work at UCSB. each superconducting qubit is capacitively coupled to a dedicated memory resonator as well as to a quantum information bus. the bus couples qubits during operations, the memory resonators store the current state of a qubit. the qubit passes into the memory resonator and is placed in the ground state. computation using combinations of the toffoli gate and simple rotations (which?) with 98% fidelity, but not good enough because of memory resources and quantum coherence time.
Historically, computers evolved from the von Neumann model, which is based on sequential processing and execution of explicit instructions. On the other hand, the origins of neural networks are based on efforts to model information processing in biological systems, which may rely largely on parallel processing as well as implicit instructions based on recognition of patterns of ‘sensory’ input from external sources. In other words, at its very heart a neural network is a complex statistical processor (as opposed to being tasked to sequentially process and execute). In modern software implementations of artificial neural networks the approach inspired by biology has more or less been abandoned for a more practical approach based on statistics and signal processing.
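the ‘complex statistical processor’ bit can be shown at its absolute smallest: a single perceptron that learns the OR function from examples instead of executing explicit instructions. my own toy sketch, made-up names and all:

```python
# a one-neuron "network": weights are adjusted from examples (learning)
# rather than written out as explicit sequential instructions
def train_perceptron(samples, epochs=20, lr=0.5):
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            err = target - out          # statistical feedback, not an instruction
            w0 += lr * err * x0
            w1 += lr * err * x1
            b += lr * err
    return w0, w1, b

samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # the OR function
w0, w1, b = train_perceptron(samples)
predict = lambda x0, x1: 1 if w0 * x0 + w1 * x1 + b > 0 else 0
print([predict(x0, x1) for (x0, x1), _ in samples])  # [0, 1, 1, 1]
```

nobody told it what OR means — the weights settled there from the error signal, which is the whole von neumann vs neural distinction in miniature.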
neural computing that is a step beyond digital computing, because it depends on learning rather than programming and because it is fundamentally analog rather than digital even though the first instantiations may in fact be with CMOS digital devices.
a biological- or molecular-based solution to computing, which involves the replacement of silicon-based transistor technology with organic or inorganic molecular and biological material, by constructing switches out of molecular material that exists in the nanometer size range. the protein bacteriorhodopsin is one such switch. advances in computing architecture and technology to replace silicon can also be found in the use of inorganic substances, such as lithium niobate used in the development of holographic memory, which exceeds the memory capabilities available from the use of bacteriorhodopsin.
Nanobiotechnology provides the means to synthesize the multiple chemical components necessary to create such a system. one can engineer a biocomputer, i.e. the chemical components necessary to serve as a biological system capable of performing computations, by engineering DNA nucleotide sequences to encode for the necessary protein components. Also, the synthetically designed DNA molecules themselves may function in a particular biocomputer system. The economical benefit of biocomputers lies in this potential of all biologically derived systems to self-replicate and self-assemble given appropriate conditions (349).² For instance, all of the necessary proteins for a certain biochemical pathway, which could be modified to serve as a biocomputer, could be synthesized many times over inside a biological cell from a single DNA molecule, which could itself be replicated many times over. It also turns out to be non-trivial to program unconventional machines. Not all problems can be decomposed to take advantage of high degrees of parallelism. And (outside of PhD theses) there aren’t a lot of tools to help programmers. Not to mention that many of the problems that require high degrees of parallelism can be attacked with special-purpose hardware — I’m thinking graphics cards here. So basically, cheap and easy beats massively expensive and hard to use.
Today, transistors on integrated circuits have reached a size so small that it would take more than 7,000 of them stacked next to each other to equal the thickness of a human hair. The transistors on Intel’s latest chips are only 14 nanometers wide — the average human hair is about 100,000 nanometers thick.
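checking that factoid’s arithmetic:

```python
# ~100,000 nm hair divided into 14 nm transistors: does "more than 7,000" hold?
hair_nm = 100_000
transistor_nm = 14
print(hair_nm // transistor_nm)  # 7142
```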