Wednesday, December 03, 2008
Polynomial-time quantum algorithm for the simulation of chemical dynamics
Via ars technica's Nobel Intent,
Polynomial-time quantum algorithm for the simulation of chemical dynamics:
Building on prior work in quantum computing algorithms, the researchers here developed an algorithm capable of computing the wavefunction of a chemical system some arbitrary time t after the simulation begins at time t=0. The algorithm requires a relatively small number of qubits.
Thursday, April 12, 2007
Saving the Planet, One Line of Code at a Time
Surely we lowly computer programmers can't do much about climate change, right? I mean, maybe we could drive a Prius to work, or work for Google, but short of that, code is code, isn't it? Well, maybe not. Here are a few things programmers could do to fight global warming:
* Boycott Windows Vista. Windows Vista is an energy-wasting nightmare, requiring grunty graphics cards and cutting edge CPUs just to perform basic windowing operations. By writing PC and web applications that deliberately don't work with Vista, you can help cajole users into running a less resource intensive operating system.
* Don't output blank or nearly-blank pages to the printer. Have you ever printed a document from Word and gotten a completely blank page at the end (except for maybe a page number)? Or have you ever printed something from an email application or web site and gotten a final page containing almost nothing but a useless footer? This type of waste must add up. Sure, you can recycle the paper, but recycling takes energy. Better not to print the page in the first place. If the user really needs that last bit of information printed out, they can opt in and get it printed. But tree-saving mode should be enabled by default.
* Write more efficient code. A CPU running at 100% capacity is sure to draw more current than a lightly loaded one. So code more efficiently and use less electricity. (Of course, if hardly anyone uses your software, and reducing clock cycles requires you to drive to work more or keep the lights on later, this could be counterproductive.)
* Support older computers. Older computers have cooler processors and less memory and therefore use less electricity than newer ones (especially if they're connected to an LCD monitor). By coding for them, you encourage their use. Recent versions of Linux should run happily on older computers that wouldn't support Windows 2000, let alone XP or Vista.
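To make the "more efficient code" point concrete, here's a toy Python example of my own (not tied to any particular application): both functions produce identical output, but the second does far less work, which means fewer CPU cycles and, in principle, less electricity.

```python
# Two ways to build a large report string. The repeated-concatenation
# version does O(n^2) character copies, because each += copies the
# whole accumulated string; ''.join does O(n) total work.
def build_report_slow(lines):
    out = ""
    for line in lines:
        out += line + "\n"   # copies the entire string every iteration
    return out

def build_report_fast(lines):
    return "\n".join(lines) + "\n"  # single pass over the data

lines = [f"record {i}" for i in range(10000)]
assert build_report_slow(lines) == build_report_fast(lines)
```

Same output, far fewer cycles; multiply that by every user running the program and the difference isn't entirely academic.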
Got any more ideas?
Wednesday, April 11, 2007
The D-Wave announcement, and the flood of articles and postings that have come from it, sheds some light on the uneasy intersection of academic research and cutting-edge private research.
Companies, being profit oriented, need to maintain a certain level of honesty if they want credibility with their potential customers and investors. They also need to carefully frame their announcements and dealings with the press in order to maximize their attractiveness, without crossing the line into outright deceit.
Scientists, being truth oriented, need to meet a higher level of honesty, the kind that includes bending over backwards to show how they might possibly be wrong.
Few companies could go very far if they attained academic-grade honesty, since exposing all their weaknesses would scare off potential investors. In a world where everyone exaggerates and everyone assumes everyone exaggerates, being very modest as a company is tantamount to admitting failure.
D-Wave doesn't have to please academics, because academics are neither investors nor potential customers (on any significant scale). For D-Wave, exposing their technology to academic-grade scrutiny can only be bad: either it will reveal that the emperor is wearing no clothes (i.e. they have essentially built an expensive, 16-bit classical computer), or else they are really onto something, and their ideas will be revealed to potential competitors.
Friday, November 03, 2006
More on My Ideal CS Curriculum
Alice had some interesting things to say about my previous post. Instead of responding to Alice's comment as a comment, I thought I'd make it an entry by itself.
"I go to a school where there is no computer science department, and thus all of the "computer science" courses I've taken have actually been math courses. I don't exactly think I've somehow satisfied the requirements for a undergrad degree in CS, but I am hoping to persuade some CS PhD programs to let me in anyway. I like the idea of a CS degree as fundamentally distinct from software engineering or what have you, though it is kind of strange for me to actually have this degree given the current expectations of graduate programs."
From my understanding of the current climate, I think you'll find graduate CS programs put a lot of value on applicants who have mathematical maturity. This would be especially true if your interests are more on the theoretical side than applied CS. See this discussion on Lance's blog.
"I found that understanding object oriented programming helped me when I got to my programming languages class. The concept of an object helped ease in things like functions being first order in SML (or really in lambda calculus). Would you really not want to teach this? Or just teach it conceptually, leaving out many of the details of implementation?"
Object-oriented-style programming is found to some degree pretty much everywhere, even in simple functional languages and plain C. But getting very deep into OOP very early is a distraction from more fundamental principles that should be learned first. A lot of huge and successful software has been written without the benefit of explicit object-oriented programming. Also, most research on the theory side is completely orthogonal to anything object oriented.
I've been influenced a bit by Alexander Stepanov, the creator of the STL (Standard Template Library of C++) which is one of the most useful and well written libraries I have ever used. You might find this interview with him interesting. He finds absolutely no value in things like inheritance and virtuals. I'm not as extreme as him, but I do think that these concepts are overused.
"Most of my classes have had some programming projects, but we weren't really taught to program so much as sent off to make something that worked in the language of our choice. Often we would modify existing code, so the projects were smaller but more focused on the theoretically significant bits. Would the theory courses have programming components?"
Yes they would, which is one reason why I think a functional language should be taught from early on. Many theoretical concepts can be illustrated in languages like Scheme or ML without too much extra baggage.
"I'm kind of surprised that you leave automata to the third sequence - this was actually the first thing I learned in college CS class, concurrent with an overview of very simple programming structures (variables, loops, etc) in java. What's the rational for this?"
It's interesting you learned it so early; I can certainly see the appeal of learning it alongside your first language. But I left it until a bit later because I think a certain degree of mathematical and computer science maturity is needed to appreciate and understand automata.
Monday, March 06, 2006
A Better CS Degree?
Reading a recent post on Can't Count Sheep got me reflecting on my own CS degree (obtained nearly 6 years ago). My course work seemed to be about half practical and half theoretical, which added up to very little. I have no one to blame but myself for what I didn't learn, but my ideal degree would cover a lot more fundamentals and a lot fewer applications. I'd say it's more valuable for a CS graduate to be comfortable with Knuth than object-oriented Java.
My ideal CS curriculum
My ideal CS degree is basically an algorithms degree, with abstract problem solving as an overarching subtext. Programming languages and mathematics are taught for the sole purpose of enhancing understanding of algorithms.
Topics like databases, compilers, AI, networking, graphics, operating systems, object-oriented programming, robotics, etc. are not included. (Undergraduate HCI will be moved to the communications department.) Yes, on one level most of these types of courses are about special classes of algorithms and data structures. But it's important that the more fundamental notions are well understood before writing FTP clients and OpenGL-based games. My view is that applications should be learned on the job, through projects, or in a master's program, but should not displace fundamentals.
The curriculum is divided into 6 sequential lots. Classes in a lot may be taken in parallel, and there are no electives. Each lot, excepting the first, requires a project. At least 2 of the projects undertaken by each student must be software systems, and at least 2 must be theoretical research projects. The students may choose their own projects, but they are supervised by grad students and have various deliverables. Students may work individually or in groups of 2 or 3, but proportionally more is expected out of multi-person groups. Projects are also required to be increasingly sophisticated as students advance.
Most classes would involve components of theoretical coursework (i.e. problem sets) plus programming in both Scheme and C. The programs required would not be large and complex, but instead short and very challenging. Both C and Scheme are emphasized because they encourage thinking from two very different perspectives. Programming is included even for traditionally theoretical courses.
I thought it would be better to err on the side of too much math and not enough EE or Physics (actually none).
00] Intro to programming with C (like Stanford's 106x when it was C based)
01] Intro to programming with Scheme (like MIT's 6.001)
02] CS Math 1 (discrete mathematics) (i.e. Stanford's CS 103x)
0x] Non-CS-specific math (i.e. calculus, linear algebra, basic …)
10] Deeper C (similar to the first half of Stanford's old CS 107. Also includes enough Unix API to serve later classes, and a small bit of assembly)
11] Introduction to algorithms and data structures (i.e. Stanford's CS 161)
12] CS Math 2 (logic and proofs)
13] Project 1
20] Automata and Complexity Theory (i.e. Stanford's CS 154)
21] Parallel and Distributed Algorithms
22] CS Math 3 (set theory, number theory)
23] Project 2
30] Intro to NP Completeness
31] Tree, Graph and Network Algorithms
32] Intro to Information Theory (similar to MIT's 6.050J)
33] Project 3
40] Searching and Sorting
41] Intro to Randomized algorithms
42] Intro to Quantum Computing
43] Project 4
50] Quantum Algorithms
51] Heuristic Algorithms
52] Provably Correct Programs (using the Z language)
53] Project 5
Course 21 might seem too specialized to be taken so early, but is placed here so later courses can take advantage of parallel and distributed algorithms. Both are becoming more and more important as multi-core processors and networking become ever more pervasive in systems.
One of the algorithms courses would include some of the theory covered in intro and mid-level database classes.
Course 31 will not be the student's first exposure to graphs and trees, but will be a chance to go deeper than most undergraduate programs go in these topics.
It's a shame that freshman-level courses like MIT's 6.050J seem to exist nowhere but MIT.
Course 40 will delve deep into searching and sorting, perhaps using Knuth Vol. III as a textbook.
Course 51 would include some of the more fundamental algorithms from traditional robotics and AI courses.
I'm not sure if courses like 52 exist anywhere. Even though Z is not commonly used, I think it would be great if a CS degree meant you could write 100% provably correct code, given enough time, if need be.
If someone would feel ill-prepared for the job market without being a Java expert, they could do something Java-related for all 5 of their projects (they could research garbage collection and just-in-time compilation for their research projects, and write Java applications for their 3 programming projects).
Sunday, September 04, 2005
Substitute Universities for Those Displaced by Hurricane
Suresh passes on word that university students displaced by the hurricane may be offered temporary places at universities around the country. Stanford is not on the list, which is unfortunate because most departments start classes on September 26, much later than most other schools. However, Stanford has a short message about possible admission for a small number of law students, and indicates that it is assessing the situation for other types of students.
Update 05 Sep 2005: The message has been updated: "Stanford will be admitting academically qualified students from these universities as non-matriculated students for the fall quarter, which starts on September 26 and ends on December 16. [...] Preference will be given to students from the San Francisco Bay Area. [...] Stanford will provide housing on campus for students who are accepted."
Although it doesn't really have a place here, it's difficult not to mention the shameful and tardy response by the US federal government in helping flood victims. It's not a big surprise in light of the fact that emergency efforts were led by a man whose previous job was attorney for the International Arabian Horse Association, a job he was essentially fired from in 2001.
Promoting intelligent design and ignoring warnings about looming natural threats are both symptomatic of anti-science thinking. Don't believe the line that no one expected this.
Update 06 Sep 2005: More commie liberal fear-mongering from several years ago here and here.
Tuesday, August 30, 2005
Evolution and Quantum Computing
This editorial by Lee Spector in the Boston Globe talks about using genetic algorithms to evolve quantum circuits, and its consequences for understanding Darwinism.
If it takes a computer program to convince someone evolution is real, then either their educators have failed them miserably, or they have been severely misled by radical (but increasingly mainstream) institutions. But it's still good to see Lee Spector speaking up, since many scientists probably feel it's pointless to get into the whole debate, since it essentially pits reason against fairy tales. But the quieter the rational side is, the more it will seem that evolution is controversial, since the wing nuts now have a voice through all levels of politics, business, and the media in the United States. After two years in Australia, I don't think the situation is much different here, except the Aussies are a lot more laid back about the whole thing.
Sunday, August 07, 2005
For those not using an RSS feed reader, the site should look a bit better now (thanks to Altman). I've let the qualgorithm.com domain expire, and moved the few non-blog resources (Peter Shor interview, Best of QC) to quantum.atticrose.com. The decidedly non-quantum Attic Rose is a side-venture which happens to have a lot of disk space and excess capacity. Right now it only has a small gallery with paintings of roses and bugs, but maybe I can convince the artist to paint something a little more quantum-ey.
Saturday, June 25, 2005
The blog template seemed to break overnight. Now, new posts seem to be disappearing...I'll fix it when things have stabilized a bit with the blogger system.
Tuesday, June 21, 2005
MIT Technology Review has a short profile of D-Wave Systems, a company trying to become the first to sell quantum computers. The company's estimates for having a real product differ from what conventional wisdom would have: "The company plans to complete a prototype device by the end of 2006; a version capable of solving commercial problems could be ready by 2008, says president and CEO Geordie Rose. [...] D-Wave's first computer won't be able to accomplish the most widely touted payoff of quantum computing: factoring the extremely large numbers at the heart of modern cryptographic systems exponentially faster than any known computer. It will, however, be ideally suited to solving problems like the infamous traveling-salesman problem, in which a salesman searches for the optimal route among cities." From the article I don't understand their architecture at all; apparently instead of using entanglement, they use quantum tunneling, so I'm not sure in what sense their system will be a quantum computer.
[Update 2005-06-23] The Quantum Pontiff, reining in hyperbole like a black hole in a saddle shaped universe, pontificates that the D-Wave system will merely run quantum adiabatic algorithms.
New Paper With Huge Claims (Debunked?)
On 16 June 2005, Andreas de Vries released the paper Fast quantum search algorithms by qubit comparisons exploiting global phase interference. The paper requires only a basic understanding of quantum algorithms to follow. Assuming there's no fatal flaw somewhere, the results would have huge ramifications for quantum computation: the algorithm is able to find an item in an unsorted database in O(log n) time, which implies that NP is in BQP. In other words, NP-complete problems would be solvable in polynomial time on a quantum computer.
[Update 2005-06-23] According to the Quantum Pontiff himself, this paper is merely the latest in a long tradition of papers erroneously making these same grandiose claims. His post points out where he thinks the fatal flaw is, and he adds more here.
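To see just how extraordinary the claim is, compare rough query counts for searching n items (a back-of-the-envelope sketch; the (pi/4)*sqrt(n) figure is the standard query count for Grover's algorithm, and n/2 is the classical average for exhaustive search):

```python
import math

# Rough number of oracle queries to find one marked item among n = 10^6:
n = 10**6
classical = n // 2                             # exhaustive search, on average
grover = round((math.pi / 4) * math.sqrt(n))   # Grover's quadratic speedup
claimed = round(math.log2(n))                  # the paper's claimed O(log n)
print(classical, grover, claimed)  # 500000 785 20
```

Going from ~785 queries to ~20 for a million items is the kind of jump that, historically, has always turned out to hide a flaw somewhere.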
Sunday, June 19, 2005
Entanglement Made Simple (PhysicsWeb) begs the (un)important question -- what do you call qudits with more dimensions than 3?
The accepted names are:
2 (binary) qubit
3 (ternary) qutrit
D (arbitrary) qudit
Clearly the first two come from bit and trit, the common terms in the classical domain, which are loosely based (I assume) on the Latin names for bases (Eric W. Weisstein. "Base." From MathWorld--A Wolfram Web Resource. http://mathworld.wolfram.com/Base.html).
If you took the boring route and named them after the Latin words, this is what you might get (your results on this [fairly pointless] exercise may vary):
4 quaternary: quatrit
5 quinary: quinit
6 senary: qusenit
7 septenary: quseptit
8 octal: quoctit
9 nonary: quonit
10 decimal: qudecit
11 undenary: qundenit
12 duodecimal: quduodecit
16 hexadecimal: quhexadecit
20 vigesimal: quvigesit
60 sexagesimal: qusexagesit
Those are mostly pretty terrible, almost as bad sounding as qualgorithm. So here's a modest proposal: instead of invoking Latin, replace the D in quDit with a number (but keep qubit and qutrit since they're pretty well accepted). e.g.:
2 (binary) qubit
3 (ternary) qutrit
4 (quaternary) qu4it
5 (quinary) qu5it
6 (senary) qu6it
10 (decimal) qu10it
16 (hexadecimal) qu16it
A bit hard to pronounce maybe, but easily recognizable in print.
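The proposal is mechanical enough to write down as a few lines of Python (a toy sketch, obviously nothing official):

```python
def qudit_name(d):
    """Name a d-level quantum system under the proposed scheme:
    keep the well-accepted 'qubit' and 'qutrit', and for other d
    just embed the number itself, e.g. 'qu4it', 'qu16it'."""
    if d < 2:
        raise ValueError("a qudit needs at least 2 levels")
    special = {2: "qubit", 3: "qutrit"}
    return special.get(d, f"qu{d}it")

for d in (2, 3, 4, 10, 16):
    print(d, qudit_name(d))
```

No Latin dictionary required, and the name is unambiguous for any dimension.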
Wednesday, June 15, 2005
42 Quantum Questions
I started this blog mainly to explore a variety of questions about quantum computer algorithms and related topics. This, I hoped, would foster some dialog in language that curious non-experts could mostly understand, while not offending the sensibilities of professional researchers. The contents of the blog have diverged pretty substantially from that, but I thought I'd list the types of questions I had in mind anyway (along with some I picked up along the way). Note there is lots of overlap, and the answers to some questions may conditionally make other questions moot. David Hilbert, I am not. (QC = quantum computer/computing)
- Will QCs ever exist, and if so, how common will they become?
- Where does the QC hype end, and reality begin?
- Will writing QC software be fundamentally different from classical programming? What classical knowledge will carry over?
- Will future programmers need to become well versed in quantum mechanics to understand and program QCs?
- Will "Quantum Processors", if they come to exist, always be secondary to classical CPUs?
- How many physical and logical qubits will be needed for "useful" quantum computation?
- What is "useful" quantum computation?
- How easy/hard is it to understand useful quantum algorithms, and the workings of QC hardware?
- Are quantum computing pessimists on to something, or are they short-sighted? Or confused?
- What computational problems are QCs poised to handle well?
- What computational problems will QCs never be able to handle?
- How do the computer science theory and physics communities regard quantum computing, i.e. As the inevitable future? A passing fad? An interesting but non-vital supplement to mainstream research?
- Can quantum computers get us closer to proving P != NP?
- How does one go about proving lower/upper bounds of QC algorithms? What implications does this have for classical algorithms?
- If QC's never come to exist, will it be because something fundamental about physics makes them impossible? Or because they'll be too expensive to build? Or because there aren't enough potential applications to justify the effort?
- How are the various QC architectures advancing? What is the frontrunner?
- When a clear favorite jumps ahead, will the other architecture programs disappear?
- What is the trajectory of experimental quantum computing (i.e. how many qubits by 2010, 2015)? Is it too early to say?
- What is the quantum analog of Moore's Law?
- Will quantum error correction work as advertised?
- How important/useful are Grover's/Shor's algorithms really?
- Were Grover's/Shor's algorithms low-hanging fruit, or the product of extreme insight and imagination?
- Are there any recent advances on par with Grover's/Shor's algorithms?
- Do the papers that generalize aspects of Grover's and Shor's algorithms represent important powerful and practical extensions, or do they amount to trivial tweaking?
- Is designing quantum algorithms difficult simply because it requires deep knowledge of several disciplines? Or is there something more to it?
- What quantum physics experiments, specifically, could be simulated on a QC?
- What will QC's mean for AI? Specifically: neural nets? Genetic algorithms? Something entirely different?
- Will quantum computing have any impact on: Operating systems? Databases? The Internet? Games?
- What classical models are most similar to quantum computer programming? i.e. parallel programming? Randomized algorithms? Nothing?
- If a quantum algorithm produces only a sub-exponential speed gain over its classical counterpart, is it of any value?
- Are there any efficient quantum algorithms that also run efficiently on classical computers, but are much easier to reason about in quantum terms than classical terms?
- What can the idea of a quantum computer tell us about the universe?
- What can quantum computing teach us about quantum mechanics (does it shed any light on entanglement? The measurement paradox? Teleportation?)?
- What can quantum computing teach us about classical computing?
- What can quantum computing teach us about the brain?
- Is the universe a big quantum computer, and if so, how do we tap into its vast processing power?
- Are there abstractions that will make programming QCs intuitive? Or is that fundamentally equivalent to making quantum mechanics intuitive, and therefore impossible?
- What's the most efficient way to learn about quantum computing, for someone with: A computer science background? A physics background? A non-technical background?
- Is understanding continuous quantum mechanics necessary for fully understanding quantum computing?
- Is it possible to follow all the important quantum computing developments if one does not have access to all the notable expensive academic journals?
- Quantum information theory placeholder (lots more questions here, but I've probably already gone overboard).
- Does this line of questioning miss the point? What else should I/we be asking?
This blog has done a pretty terrible job of even discussing these questions. This is mainly because it's taken me a lot longer to really understand quantum computing than I expected, which will probably be the subject for a future post, if I can find the right way to put it.
Friday, June 10, 2005
Weakest Link in First Generation Quantum Cryptographic Systems
Peter Rohde accurately points out that the first generation quantum cryptographic systems do not exchange one time pads, as he assumed, but instead exchange keys for protocols such as triple-DES and AES. According to him, "Of course this completely undermines the security of QKD, since QKD inherently derives its security from the fact that the one-time pad is the only completely secure cipher."
Peter goes on to say: "My take on all this is that customers of current QKD systems are paying hundreds of thousands of dollars for cryptosystems no more secure than freely available software packages like PGP."
[Updated 2005-06-11] I responded to this in the comments section of his blog, which you can see along with Peter's response here (below the posting).
While the original designers of QKD might have had a one-time pad in mind for the key, current technology is not quite up to snuff to support the data rates one-time pad exchange would require for encrypting realistic amounts of data. So what's the point of QKD at all until the technology gets faster?
The benefit of today's QKD, which refreshes keys at a rate of 4 to 100 per second (depending on which company and press release you pay attention to), is a significant improvement over today's classical alternative. Why? Well, as engineers know, you never want a single point of failure in a system. But today's widely-used encryption protocols rely on just that: a master key (usually an RSA private key) used to exchange all the other keys. If that master key is ever stolen or deduced, then every transmission is compromised until a new master key is issued.
In QKD, the master key is theoretically completely secure, so the weakest link of the system is the 4-100 keys that are exchanged every second. If someone were capturing all the data encrypted this way, they would need to break each key to see all the data being transmitted. Not theoretically impossible, but no longer a single point of failure either.
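A toy calculation makes the difference concrete (the 10 keys/second figure is made up, just a point inside the 4-100 range quoted above):

```python
# With one long-lived master key, stealing a single key exposes a whole
# day's traffic. With QKD-style refresh at 10 keys/second (hypothetical),
# each key protects only a tenth of a second of traffic, and an
# eavesdropper who recorded everything must break every key.
seconds_per_day = 24 * 60 * 60
refresh_rate = 10  # keys per second

keys_master = 1
keys_qkd = seconds_per_day * refresh_rate

exposure_per_broken_key_master = 1.0          # fraction of the day exposed
exposure_per_broken_key_qkd = 1.0 / keys_qkd  # one refresh interval

print(keys_qkd)  # 864000 keys to break for one day of traffic
```

Breaking one key gets an attacker a tenth of a second of data instead of everything, which is exactly the "no single point of failure" property engineers want.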
This might not sound great to some, but others are willing to pay tens of thousands of dollars for top-notch security, even if it isn't perfect. Those in the industry know that security is a relative concept.
This all gave me the idea for a feature that today's systems may be able to support: the ability to send short one-time pads. If a brief, highly sensitive message needs to be sent, a few thousand bits of one-time pad could be distributed before switching back to triple-DES/AES mode for regular transmission.
[Updated 2005-06-11] (According to Peter, this already exists in some systems)
[Updated 2005-06-11] In a comment below his posting, Peter writes "Another point, which I didn’t mention, is that commercial QKD systems don’t actually implement ‘true’ BB84, since they don’t have true single photon sources. Instead they use attenuated coherent states which, at least in principle, introduces some room for intercept attacks." This is a good point, and it actually makes it pretty difficult to truly compare first-generation quantum crypto systems with classical alternatives. A recent breakthrough will eventually fix this problem, but, according to the article, a commercial version is still 2-3 years away.
Thursday, May 12, 2005
Parallel Programming with Matrix Distributed Processing
This paper describing a C++ parallel matrix library has some very impressive programs as examples, like a complete parallel version of the Game of Life, written in 39 lines of C++. I'm sure there are many other parallel matrix libraries out there, but this one seems worthy of consideration especially for anyone designing a parallel quantum computer simulator.
For those who think of parallel computers as being massively large and expensive machines in the basements of research labs, or world-wide Internet-based parallel programs like SETI@home, that image is going to be changing drastically over the next 5-10 years. Single core processors are getting too hot at high clock frequencies, so chip makers are currently moving to dual core designs. The forthcoming XBox 2 is apparently going to feature a triple-core CPU. If adding cores is the only way to maintain Moore's law for the near future, then it's pretty obvious what's going to happen, and in 10 years, 30-40 cores per desktop computer or video game console could be commonplace. This will have a huge impact on CPU-bound programs. In other words, most programmers will need to become very good at some form of parallel programming.
Although classical parallel programming is extremely different from quantum computer programming, maybe the paradigm shift will spur interest in what else is coming down the road, i.e. quantum computer programming. I hope the classical theoretical community is ready for the forthcoming upsurge of interest in parallel algorithms. They're going to need a model better than the PRAM.
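As a minimal sketch of that shift (my own example, not from the paper): spreading a CPU-bound computation across cores with Python's standard multiprocessing module.

```python
# Count primes below 100000 by splitting the range into chunks and
# farming the chunks out to one worker process per core.
from multiprocessing import Pool

def count_primes(bounds):
    lo, hi = bounds
    def is_prime(k):
        if k < 2:
            return False
        return all(k % d for d in range(2, int(k ** 0.5) + 1))
    return sum(1 for k in range(lo, hi) if is_prime(k))

if __name__ == "__main__":
    chunks = [(i, i + 25000) for i in range(0, 100000, 25000)]
    with Pool() as pool:  # defaults to one worker per available core
        total = sum(pool.map(count_primes, chunks))
    print(total)  # the number of primes below 100000
```

The serial version is just `count_primes((0, 100000))`; the parallel one is the same function applied to chunks, which is the easy case. The hard part, and the reason the theory community will be needed, is all the problems that don't decompose this neatly.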
(P.S. I forgot to keep track of the blog which linked to the aforementioned paper. My apologies)
Tuesday, March 08, 2005
When I was a kid, my uncle told me a great story from his freshman year at Princeton about a brilliant physicist who gave a guest lecture on diffraction. I couldn't remember who it was or what the details of the story were, so I asked him to email me the story:
The person who gave the lecture was Eugene Wigner, who a few months earlier, in 1963, had received the Nobel Prize. We knew that Wigner was going to give the lecture and we were anticipating it. When we entered the big old lecture hall, there was a pleasant-looking, older gentleman holding the door open for us. It didn't hit us at first who that man was, but that was the way he wanted to introduce himself to us. Damned, pure genius looked like a guy you would expect to see sipping coffee in a Viennese cafe. Pleasant as can be, unassuming, smiling pleasantly at each person as we passed by.
There were two large blackboard spaces that had two blackboards in each space, one forward and one behind. A lecturer could write on a blackboard and then save the writing by pushing that board up and hauling the other one down to continue writing. Wigner had a hard time handling this setup. "Where did I write that?" "How does this work?" "Which one should I write on first?" Practical problems like that. There was also a string that hung down in the middle of this confusing apparatus, which could be used to manually lower the projector screen. This last complication and visual interference was particularly troublesome and a source of continual irritation as equations were developed on either side of the string, sometimes causing an equation to be divided, not at the equal sign, but at an inconvenient place in the expression.
However, Wigner was still able to focus sufficiently to do things that I have never seen before or since. We knew that Wigner had developed the math Fermi needed to build the first nuclear reactor (the "Pile" of carbon neutron moderator, boron neutron absorber, and uranium neutron producer and gamma-ray (energy) generator, which would be critical but not supercritical; a supercritical reaction is an exponential power spike that could have quickly changed the configuration and location of that nice pile under the sports stands at the University of Chicago).
Wigner said that he had accepted this teaching challenge even though he had no recollection of exactly what the phenomenon was when light went through space occupied by matter. Nevertheless, he accepted the assignment. He brought a dictionary, which he used to look up "diffraction", though, as I recall, he had as much trouble coming up with the correct spelling as I just had. He read the definitions, though he had to think for a while about which definition he should use, the one that promised to clarify the phenomenon rather than confuse us, and no doubt him, further. "Ah hah," in that good-humored German accent, "this is the von to use". We were all thinking, OK, if you say so.
Wigner started to crank out the equations, everything with triple integrals and more Greek letters than frat row, all written in a hand that made them look kind of similar, each like a variation of "Q". As he approached the end of his allowable space on the fourth blackboard, he began to slow down and a scowl crossed his face. In mid-equation he stood back and said, "This cannot be right. Something is wrong." His hand passed over the equations, starting back from his last one. He went back a few equations, then slowly stood back from the board, pointing, and said, "That's it. That is the mistake." He erased the now obvious (to him) error, changed the appropriate places in the subsequent equations, and finished the last equation. He turned to us in the audience, a slow grin broke out on his face, and he said quietly, "That makes sense."
No one said anything. We were all trying to appreciate the moment and store it in long-term memory. We broke out in respectful applause, not too much, so we wouldn't embarrass this sensitive man. We were a classroom of freshman engineers, so a real understanding of the material presented to us was beyond any of us. But we could appreciate seeing how a mind in tune with nature translates observation and logic into a mathematical model.
Wigner flushed a little, bowed slightly, went to the door and held it open for us as we filed out, a little dazed, heading for our next class, which had no hope of holding us enthralled as Mr. Wigner had.
Tuesday, February 15, 2005
My collection of links on the right isn't very complete, but I had no idea how much was riding on it until I discovered Blog Shares, a stock market for blogs. Someone named Javier López actually owns 75% of the shares of this very blog, although I'm sure he'll eventually regret the purchase if he doesn't already. David Bacon owns 20% of his own Quantum Pontiff blog, which is about to experience a modest increase in (virtual) valuation, thanks to the extremely overdue link I finally added. I wonder if blogshare money will ever have real-world value. (If it sounds implausible, see EverQuest.)
Some quantum or CS type blogs, and their valuations [Updated Feb-16, thanks Suresh]:
Computational Complexity Web Log: $12,029.76
Michael Nielsen: $5,845.59
The Quantum Pontiff: $3,849.21
Quantum Algorithms: $3,816.68
Illuminating Science: $3,313.81
Quantum Bits: $2,709.12
Quantized Espresso: $1,664.65
Friday, January 21, 2005
Eye Opening Quote
Chakra Yadavalli dug up a very interesting quote. Check out the original post, but I'm going to steal it because it's too good not to, and also so I can make a cheap point:
Question by Dr. Manickam, Pune University: There is an effort in Europe for secure networking based on quantum computing. Why not initiate such projects in India?
Answer By [Name Withheld (For Now)]: A Quantum computer is a device that harnesses physical phenomenon unique to quantum mechanics (especially quantum interference) to realize a fundamentally new mode of information processing. Encryption, however, is only one application of a quantum computer. In addition, a researcher has put together a toolbox of mathematical operations that can only be performed on a quantum computer, many of which he used in his factorization algorithm. Currently the power and capability of a quantum computer is primarily theoretical speculation; the advent of the first fully functional quantum computer will undoubtedly bring many new and exciting applications. Quantum computing is one of the areas, where India can contribute substantially. We are now working on a nano-technology mission which can make realizable quantum computers. The Conference can debate and make suggestions on how we can bring in synergy in this crucial area.
So guess who said this: Is it the head of a physics department at a top university in India? Maybe a government minister of science and technology?
How about: Abdul Kalam, the President of India.
As Chakra Yadavalli alluded to, imagine your favorite North American (or Australian) leader answering the same question, and think about how they'd respond.
Paltry competition aside, Kalam deserves recognition in his own right for being inquisitive enough to know this much. After looking at his background, his quote is not surprising at all: Kalam was a highly successful aerospace engineer before getting mixed up in politics. According to Wikipedia, the Presidency of India is a largely symbolic role, and the Prime Minister holds the true seat of power. But no matter, with such a good scientist in a prominent role, it's no wonder India is ascending as a science and technology powerhouse.
As a nation that places such a high value on democracy, education, intellectuals, and fluency in English, India deserves all the white-collar outsourcing it can get. This also solidifies my thinking that Americans have no right to complain about losing high-tech jobs until they (well, okay, I admit: we) elect a leader who understands technology. I'm not talking about enacting protectionism; I mean investing in science and technology to secure a role in the future. Shrinking science budgets won't get the job done.
Sunday, January 09, 2005
Waking Up QubitNews
QubitNews is a pretty good way to keep up with newly available positions and the occasional conference in quantum computing. What would be even better is if it became the Slashdot of quantum computing...but that can only happen if people want to contribute. There are a couple of new discussion-friendly topics, here and here.
(Ideally, my blog will become completely redundant, replaced by a user-supported community containing all the same content, with the added benefit of commentary and discussion by many experts active in the field.)
Saturday, January 08, 2005
October Wrap Up
-Detecting a single spin
-Quantum Register Experiment with Neutral Atoms (more here)
-Device for splitting a stream of quantum objects