
Saturday, October 30, 2010

some generality of science

I've been on a critical track for a while, and I'm looking for a more constructive direction.
I'd like to get back to some of the excitement I've had for science. And it really covers a lot of good stuff.

So what are some of the important things we know? I think Feynman said that the most astonishing and important piece of knowledge we have is about atoms. Here is something like universal knowledge. We can take anything we find, anywhere, and if we break it down in a variety of ways, we find atoms making it up. Often you get molecules instead of atoms, but the idea is that breaking material apart always yields members of a fixed set of elements. This does seem to be pretty significant knowledge, and to always be true. Does this mean that atoms are the "underlying story" of everything? Maybe.

So this seems to be a kind of reductionism defined operationally. Stuff can always be broken down into atoms. (How to make it more precise? Step 1: find something. Step 2: break off a little piece. Step 3: break off a little piece of that. Step 4: heat it up? explode it? ...)

What else do we have? We have light. There are radio waves, visible light, ultraviolet waves, and x-rays. You can't really talk about light in the same way you talk about matter. You don't break stuff apart and find light at the bottom. Stuff goes through transitions, and light is emitted; it makes other stuff go through transitions.

I really don't want to end up with a network here, with nodes and messages being passed between the nodes. Yuck. I'm sick of networks.

Anyway, yeah, there's stuff, and there's light.

What about a beautiful forest, an intricate ecosystem? Stuff and light? Does that get us very far in understanding and appreciating it? More work for another day.

Wednesday, October 27, 2010

research infrastructure

In some ways, people doing research may do more work than others do. You take the problem with you all the time, and you put a lot of yourself into solving it. The benefit may be that you have the freedom to pursue something that really interests you, so that your work and your passion are aligned.

There's a danger when a field does not have a strong research culture, but still has a research component. If most of the work to be done really isn't research, then what is and isn't research may be confused. A person programming for a company doesn't think of themselves as doing research, but rather as problem solving. The difference is that much of the framework is predetermined.

In a field without a strong research infrastructure, one is expected to build the framework oneself, yet one will never really do something new, because the information is just badly managed: the problems were solved long ago. If one is supposed to be doing research, but is really just catching up to where others have already been, or cleaning up old messes, that is not very healthy. Instead, this component should simply be called work; a set of objectives should be set up, and the work divided among those doing it.
Summary: if it's work and not research, then there need to be very clear goals, and it should be finished, even if imperfectly. Depth, creativity, and perfection are not well spent on something that cannot support real innovation. Real innovation will look bad at first, will take a while to get somewhere, and may need other people to finish things at a later time. I need to learn to separate research from work. I never seem to quite learn this lesson, and it gets me again and again.

Sunday, October 10, 2010

beam distribution, lifetime, synchrotron radiation

Ok, closer to what I should actually be working on...
We have an electron beam with a variety of interactions. At high energy in a storage ring, you mainly get a Gaussian distribution, due to the damping and diffusion processes from synchrotron radiation. The beam's self-interaction mainly manifests as a finite beam lifetime: scattered particles are lost, giving what's known as the Touschek lifetime. Also, if there's an aperture close enough to the Gaussian core, then particles are lost through the diffusion process, giving what's known as the quantum lifetime.
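To get a feel for the quantum lifetime, here is a minimal numerical sketch. The formula and all the parameter values are my own assumptions (the standard Gaussian-beam estimate, tau_q ~ (tau_damp/2) e^xi / xi with xi = a^2 / (2 sigma^2)), not numbers from any real ring:

```python
import math

def quantum_lifetime(tau_damp, aperture, sigma):
    """Rough quantum-lifetime estimate for a Gaussian beam.

    tau_damp: transverse damping time [s]
    aperture: distance from beam center to the aperture [m]
    sigma:    rms beam size [m]
    Uses the standard estimate tau_q ~ (tau_damp/2) * exp(xi) / xi,
    with xi = aperture**2 / (2 * sigma**2).  (Formula assumed here.)
    """
    xi = aperture**2 / (2.0 * sigma**2)
    return 0.5 * tau_damp * math.exp(xi) / xi

# The lifetime is extremely sensitive to the aperture/sigma ratio:
for n in (4, 5, 6):
    print(n, quantum_lifetime(0.01, n * 1e-3, 1e-3))
```

The exponential dependence is the point: moving the aperture from 5 to 6 sigma changes the lifetime by orders of magnitude, which is why the Gaussian tails matter so much.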

Suppose we have a non-Gaussian beam. How did it get this way? What does this say about the lifetime?

We can treat the synchrotron radiation effect on the distribution via the Fokker-Planck equation. Can other noise processes on the beam also be treated via the FP equation? What would a non-Gaussian distribution do to the emitted synchrotron radiation?
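As a sanity check on the damping-plus-diffusion picture, here is a toy simulation (parameter values are placeholders I chose, not measured): integrating the Langevin equation dx = -(x/tau) dt + sqrt(2D) dW, whose associated Fokker-Planck equation has a Gaussian steady state with variance D * tau.

```python
import numpy as np

rng = np.random.default_rng(0)
tau, D = 1.0, 0.5            # damping time and diffusion coefficient (made-up values)
dt, n_steps, n_part = 1e-3, 20_000, 5_000

x = np.zeros(n_part)
for _ in range(n_steps):
    # Euler-Maruyama step: deterministic damping drift plus a diffusive kick
    x += -(x / tau) * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n_part)

# The Fokker-Planck steady state is Gaussian with variance D * tau
print(x.var(), D * tau)
```

Adding another noise process with a non-Gaussian kick distribution to the same loop would be one concrete way to ask the question above: does the steady state stay Gaussian, and what happens to the tails near an aperture?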

Finally, does anyone care? We use this radiation for all sorts of experiments. Which experiments care about lifetime? Which care about beam size? Which care about the coherence of the radiation?

phil sci, dirac

From Taking up Spacetime is a link to the philosophy of science preprint server, here.
I've been browsing a bit and enjoying reading some papers. I found this paper on interpretations of the Dirac equation here (M. Valenti, 2008), which I've skimmed a bit. He talks about using QED to describe the hydrogen atom, which I'd be interested to understand better. It does seem that QFT mostly calculates S-matrix type quantities, and bound states are more foreign. If non-relativistic QM and the Dirac equation really come out of QED, then it should be able to deal with bound states. I vaguely remember something about "resonances" (related to the complex poles) of the S-matrix being the bound states...
Maybe too hard to understand right now, but interesting stuff anyway.
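For my own reference, the half-remembered statement above can be made precise in ordinary potential scattering (textbook material, not from the Valenti paper):

```latex
% Bound states as poles of the S-matrix (single-channel potential scattering):
% a bound state of energy
%   E_B = -\frac{\hbar^2 \kappa^2}{2m}, \qquad \kappa > 0,
% shows up as a pole of S(k) at k = i\kappa on the positive imaginary axis,
%   S(k) \sim \frac{\text{const}}{k - i\kappa},
% whereas resonances are poles at complex k in the lower half plane,
% whose real part gives the resonance energy and imaginary part its width.
```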

(added... Ok, here's an interesting quote related to this reductionism, model building stuff:
In this way we are not restricted by Haag’s theorem – and so we can retain the concept of quanta in the description of interactions – because, from a physical point of view, the Lagrangian of quantum electrodynamics does not provide us (contrary to what from a mathematical abstract point of view might appear) with the possibility of describing a system of (undifferentiated) interacting Dirac and Maxwell fields, but with a way of developing models that describe in a limited way the interaction between the fields.
Valenti, p. 14)

Friday, October 08, 2010

inner space

I was complaining a little while ago that with all this network stuff, blogs, hypertext, RSS, info overload, etc., the challenging thing may be to relearn how to read. I've certainly noticed that I haven't been reading novels in depth in a while, but this probably has multiple causes.

Anyway, it seems that in order to really take something in, we need to spend time on it and not be distracted. We need to have space for it, to be relaxed and not overloaded. This sounds like conventional wisdom, but it can be easy to forget. We can see the effects of reading more, but sometimes forget the value of reading less, more carefully: allowing the words, sentences, and ideas to sit there for a while, to connect up with each other; trying to see what bigger picture the author was constructing; relating the text to our own experience and ideas. This is an internal process that can't be sped up by skimming, or by reading more background information and researching every last connection. I know some people who analyze texts by making word clouds, finding out which words or phrases are most prevalent, and who want to extract understanding from such an approach. I imagine you can get more sophisticated and start writing programs to understand things for you, and just hand you some kind of summary.

It reminds me of tools in math and physics such as calculators, programming languages, compilers, etc. Here we let the machine do the work for us. Is there a benefit to doing the work ourselves? I know that I learned calculus by doing many, many integrals on paper, learning the ins and outs of various tricks. Doing this builds up intuition and a whole workshop that connects to your other thoughts. You get a lot more out of it than just the ability to solve a specific problem. In addition, you get a sense of which problems might be solvable, and of ways in which definitions might be shifted to get useful results.
You become richer, and it's more fun.

Anyway, since language and ideas are important to me and a source of lots of enjoyment, it's sad to see ways in which some kinds of thought may be bypassed. My general approach is to make sure I can do something by myself before I adopt a tool that can do it better and faster.
I do use Mathematica to test whether an integral may have an analytical solution, but I feel good that I can (or at least used to be able to!) do it myself. (Here's a page of an accelerator physicist with some quotes on this topic.)
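The same kind of check can be done in Python with SymPy (the particular integral here is just an example I picked, the familiar Gaussian one):

```python
import sympy as sp

x = sp.symbols('x')
# Ask the CAS whether a closed form exists for the Gaussian integral
result = sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo))
print(result)  # sqrt(pi)

# ...but it's worth being able to get there by hand too: squaring the
# integral and switching to polar coordinates gives pi, hence sqrt(pi).
```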

I want to be able to read in depth, think in depth, and calculate in depth, and to take my time with it, even if someone or something can reach the same conclusions faster or more reliably. The inner world involved, and its benefit to me as a person, seem incalculably valuable.

Wednesday, October 06, 2010


I'm still reading Nancy Cartwright sometimes, and my tendency is toward criticism, toward thinking about why things are not as big a deal as people say they are.
But at some point this isn't so great a way to do things, or to live life. Actually, she writes in one of her books that people tell her that her project is not very inspiring. But she does have a positive project, I think, along with the criticism.
Anyway, for me: I do like physics a lot. Better to find some real questions to work on and try to make progress. The critical approach can always be there, but if it's all there is, it's not so productive, and it probably leads to mistakes too...