PH 312 Lab 3

Quick and Dirty 4-D Visualization for the Masses

READ ahead before trying this lab, especially the first three sections. IMPORTANT: Answer the pre-questions before you attempt the lab.

This lab really isn't so much about physics as it is about analysing complex data sets. The data set we choose to analyse is from the lab Mathematica Simulation of a particle in a 3-D Box with Infinite Potential Walls.

You'll need the file from the lab Mathematica Simulation of a particle in a 3-D Box with Infinite Potential Walls; I think you saved it as later.ma. You'll also need to copy the file binfiles.ma to your disk or to the Mathematica directory.

BACKGROUND

We considered a 3-D box with sides of length L. Inside the box the potential is zero; outside the box the potential is infinite. To visualize the results, I had you use the Scatterplot module to make a 3-D plot of the probability of finding a particle in the box for a certain state. If you think about it, this is really four-dimensional data: probability (one dimension) versus x, y, and z (three additional dimensions). Perhaps you should look at those plots again. The plots we made presented the probability of finding the particle at location x, y, z in the box. The density of points in the space represented the probability: a high density of points indicated a high probability. This type of visualization is called a density plot or scatter plot. Of course, there are many other visualization techniques you can use to look at 4 (and even higher) dimensional data.
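
As a reminder of how such a density plot can be built, here is a minimal sketch of the idea. This is not the Sctrplot.ma module itself; it assumes a current version of Mathematica and that psi2, norm, and the box size L from the earlier lab are already defined, and the state (1,1,1) and the ceiling maxval = 0.01 are purely illustrative choices:

(* Rejection sampling: keep a random point (x,y,z) with probability proportional *)
(* to psi2 there, so the kept points cluster where the probability is large.     *)
maxval = 0.01;   (* assumed rough upper bound on psi2 *)
points = {};
Do[
  pt = RandomReal[{0, L}, 3];
  If[RandomReal[] < psi2[pt[[1]], pt[[2]], pt[[3]], 1, 1, 1, norm]/maxval,
    AppendTo[points, pt]],
  {5000}];
ListPointPlot3D[points, BoxRatios -> {1, 1, 1}]

The denser the cloud of plotted points in a region, the higher the probability of finding the particle there.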

One common way to store a large four-dimensional data set is as a three-dimensional matrix whose elements hold the values of the fourth dimension. (The fourth dimension must be the only dependent variable.) If you sample the volume of interest at, say, 256 points in each direction, the matrix representing that data would be a 256 x 256 x 256 matrix. Sophisticated software applications can then be used to slice and dice the sampled volume. Often, with the use of colour look-up tables, the software can assign colours to the dependent dimension.
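
For concreteness, here is a minimal sketch of building such a matrix in Mathematica, using a hypothetical function f[x, y, z] as the dependent (fourth) dimension and a small 32-point sampling so it evaluates quickly:

(* Sample a hypothetical f[x,y,z] on a 32 x 32 x 32 grid spanning the box 0..L. *)
(* The element values of the resulting 3-D array carry the fourth dimension.    *)
n = 32;
volume = Table[f[x, y, z],
   {x, 0, L, L/(n - 1)}, {y, 0, L, L/(n - 1)}, {z, 0, L, L/(n - 1)}];
Dimensions[volume]   (* -> {32, 32, 32} *)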

Slices in a data set are just that. Imagine taking a cube of butter and injecting it with red dye. How does the dye diffuse in the butter? One way to examine the diffusion pattern is to take slices off one side and, slice by slice, note the pattern the dye makes as it is uncovered. This is aptly called slicing. In a set of real data, you can look at several slices at once and, by assigning the transparency of each slice, sample the data at several locations throughout the data set.
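
In terms of the sampled-volume matrix sketched above, a slice is nothing more than one index held fixed; for example (again using the hypothetical volume array):

(* Pull out the 16th x-layer and show it as a 2-D density plot. Stepping the *)
(* first index from 1 to 32 shaves through the volume slice by slice.        *)
slice = volume[[16]];    (* a 32 x 32 matrix: the y-z plane at that x value *)
ListDensityPlot[slice]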

Dicing encompasses several techniques.

One is looking at surfaces of constant value, called isosurfaces. For the physicist, perhaps the most familiar examples are the equipotential surfaces we learn about in electromagnetism. For instance, the isosurfaces of voltage (equipotential surfaces) about a point charge are spheres. Returning to the dye-in-the-butter example, suppose you could accurately determine, by eye, where the concentration equals 0.1 grams of dye per cm3. You take a scalpel and carefully carve away the butter along the surface that has that concentration. You are left with, besides a large mess, the isosurface of 0.1 grams of dye per cm3 in the diffused butter. In terms of real data, you can 'draw' several isosurfaces representing different values of the data and assign the transparency of each. This way one can see into the center of the data.
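
Current versions of Mathematica can draw such a constant-value surface directly; here is a minimal sketch for one isosurface of psi2 (assuming psi2, norm, and L are defined as in the earlier lab, and picking 0.005 purely as an illustrative value):

(* Draw the surface on which psi2 equals the chosen value throughout the box. *)
ContourPlot3D[psi2[x, y, z, 1, 1, 1, norm] == 0.005,
  {x, 0, L}, {y, 0, L}, {z, 0, L}]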

Another technique, commonly called alpha-channel dicing, makes use of the colour channels (the red, green, and blue channels) and the transparency factor, or alpha channel. In this method, one edits the colour tables so that they cover only the values of interest. For instance, in computing the probability of finding a particle in a 3-D quantum well, one is not especially interested in probabilities between about 0.0 and 0.4. One edits the colour tables so that, for instance, the red channel peaks at 1.0 and falls off to zero at 0.7. The green channel starts at zero at 0.8, peaks at 0.6, and returns to zero at 0.5. The blue channel begins at 0.7, rises to a peak at 0.5, and falls to zero again at 0.4. The alpha channel is edited so that high probability has a high alpha value and is therefore opaque. The alpha channel might fall linearly toward 0.5 so that the blue and green (low probability) regions are transparent. This is an oversimplified example. Editing the colour tables is not difficult, but it usually requires much trial and error to create an easy-to-interpret visualization of the data.
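
Although Slicer edits its colour tables through menus, the underlying idea is just a transfer function from a scaled data value (0 to 1 here) to a colour and an opacity. Below is a rough Mathematica sketch with piecewise-linear channels using roughly the break points described above; all the break points are illustrative and would normally be tuned by trial and error:

(* Red dominates high values, green and blue the mid values, and the alpha        *)
(* (opacity) channel ramps up with the data value so only high values are opaque. *)
red   = Interpolation[{{0.0, 0}, {0.7, 0}, {1.0, 1}}, InterpolationOrder -> 1];
green = Interpolation[{{0.0, 0}, {0.5, 0}, {0.6, 1}, {0.8, 0}, {1.0, 0}}, InterpolationOrder -> 1];
blue  = Interpolation[{{0.0, 0}, {0.4, 0}, {0.5, 1}, {0.7, 0}, {1.0, 0}}, InterpolationOrder -> 1];
alpha = Interpolation[{{0.0, 0}, {0.4, 0}, {1.0, 1}}, InterpolationOrder -> 1];
transfer[v_] := RGBColor[red[v], green[v], blue[v], alpha[v]]
transfer[0.9]   (* a reddish, nearly opaque colour for a high scaled value *)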

Projections of the data onto a surface (flat or otherwise) are also possible. Projections don't really fall into either the slice or the dice category. One way to think of a projection is to consider nested glass bowls held between an overhead projector and a screen: one sees the projection of the surfaces of the bowls on the screen. By editing the alpha channel and colour tables (see the alpha-channel dicing example), one can create surfaces (isosurfaces or a continuum of surfaces) with different 'optical' properties. A simulated white light source is applied to the data volume and projected onto a screen. The white-light projection can be programmed so that the light picks up the dominant colour it encounters along each path taken through the volume.
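
A crude numerical stand-in for such a projection is simply to add up the sampled values along one axis and view the result as a 2-D image (again using the hypothetical volume array from the storage sketch above):

(* Summing over the first index 'projects' the volume onto the y-z plane, *)
(* much as light accumulates colour passing through a stack of slides.    *)
projection = Total[volume];   (* a 32 x 32 matrix of column sums along x *)
ListDensityPlot[projection]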

Of course, moving animations can be made from slices or dices. For instance, one could create slices in the y-z plane at intervals along the x-axis. By linking the slices into an animation, one may see the slices in succession. Moreover, if the interval between slices is small, the animation will show the data in the y-z plane as successive shavings are taken off along the x-axis. Another example is to create isosurfaces and then render the volume at successively different viewing angles. These different views can be linked together to form a rotating isosurface, a flyby of the surface, or even a journey around and through the surface. The ways these techniques can be combined into an animation are limited only by the imagination and the available computer time.
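
In Mathematica terms, the slice animation described above is just a list of density plots shown one after another; a minimal sketch (current Mathematica, hypothetical volume array again):

(* One y-z density plot per x-layer, flipped through as an animation. *)
frames = Table[ListDensityPlot[volume[[k]], PlotRange -> All], {k, Length[volume]}];
ListAnimate[frames]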

LEARNING ABOUT CREATING A DATA FILE

I have provided you with a module called binfiles.ma. This handy little module provides the tools to take separate streams of bytes and concatenate them into a single binary file. The idea is that you calculate a portion of a larger computation, save the results to the file, and then compute more chunks. The module contains the tools needed to create large binary files. The function

IncrementalScaleTool[1,255,globalmin,globalmax,data]

takes a list in the variable data, whose values lie in the range between globalmin and globalmax, and rescales it to the range 1 to 255. The result is a list of 8-bit bytes, so when the list is saved it will be a standard 8-bit binary file. Since the final computation will contain a great number of values, it's important to keep the files small. A binary file is about as compressed as you can get (without fancy compression software). The other tool is the function

IncrementallyBuildBinaryFile[filename, data, chunknumber]

which takes the variable data, which is expected to contain values in the 8-bit range of 1 to 255, and writes it to a file named filename. The chunknumber must be set to 1 the first time the tool is called so that the function creates the file. The function returns the value of chunknumber + 1, which makes it easy to call this function repeatedly.
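
Put together, a chunked write looks something like the sketch below. The function somefunction is a hypothetical stand-in for whatever you are computing, and its values are assumed to lie between 0 and 1 (those are the globalmin and globalmax passed to the scaling tool); the running chunknumber is what lets the file grow piece by piece:

(* Write three chunks of scaled values, one after the other, into test.dat. *)
chunknumber = 1;
Do[
  chunk = Table[somefunction[x, k], {x, 0., 1., 0.1}];   (* compute one piece  *)
  chunk = IncrementalScaleTool[1, 255, 0, 1, chunk];      (* rescale to 1..255 *)
  chunknumber = IncrementallyBuildBinaryFile["test.dat", chunk, chunknumber],
  {k, 1, 3}]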

EXAMPLE OF CREATING A LARGE BINARY DATA FILE

The excerpt below computes the function psi2 throughout a volume, row by row, saving the results after each row is complete before moving on to the next. Specifically, for fixed values of y and z, psi2 is computed at 52 values of x and the results are saved. Then y is moved up one increment and, again, psi2 is computed at 52 values of x. This is repeated 52 times until the entire x-y plane has been computed for one value of z. The whole procedure is then repeated for 52 increments in z.

<<binfiles.ma

SetDirectory["c:/"];

filename = "well111.dat";

numberofsteps = 51;

L = 10;

globalmax = N[0.01];

globalmin = 0;

ixmin = 0;

ixmax = L;

ixstep = (ixmax-ixmin)/numberofsteps;

iymin = 0;

iymax = L;

iystep = (iymax-iymin)/numberofsteps;

izmin = 0;

izmax = L;

izstep = (izmax-izmin)/numberofsteps;


chunknumber = 1;

totalnumberofpoints = (numberofsteps+1)^3;

Print[ "Estimate of Binary Data File Size: ",totalnumberofpoints];

begintime = AbsoluteTime[];

computation = 0;

Do[

Do[

data = Table[ 0, {ix,ixmin,ixmax,ixstep}];

xvals = Table[ ix, {ix,ixmin,ixmax,ixstep}];

i = 1;

Do[

data[[i]] = N[psi2[ xvals[[i]],iy,iz,nx,ny,nz, norm] ];

computation = computation + 1,

{i,1,Length[data]}];

data=Flatten[data];

data = IncrementalScaleTool[1,255,globalmin,globalmax,data];

Do[ If[ Not[NumberQ[data[[i]] ] ],

Print[" data is not numerical at x = ",xvals[[i]]," y = ",iy," z = ",iz,"; it is ",data[[i]]," !"]],

{i,1,Length[data]}];



chunknumber = IncrementallyBuildBinaryFile[filename, data, chunknumber],

{iy,iymin,iymax,iystep}];

Print[ N[ Length[data]^2 * ((iz-izmin)/izstep+1) /totalnumberofpoints * 100],

" % Done, Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ],

{iz,izmin,izmax,izstep}]

Print[ "Actual File Length (bytes): ", (chunknumber - 1) * (Length[data]),

" Total Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ];

Some explanations:

The statements

Print[ "Estimate of Binary Data File Size: ",totalnumberofpoints];

begintime = AbsoluteTime[];

will print out an estimate of the final file's size and start a timer, while the statements

Print[ N[ Length[data]^2 * ((iz-izmin)/izstep+1) /totalnumberofpoints * 100],

" % Done, Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ],

{iz,izmin,izmax,izstep}]

Print[ "Actual File Length (bytes): ", (chunknumber - 1) * (Length[data]),

" Total Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ];

will print out the percentage done and the elapsed time as the computation proceeds, followed by the actual final file size and the total elapsed time once the complete calculation is finished.

Just in case the variable data isn't a flat list (a vector), the statement

data=Flatten[data];

forces it to be one. As a precaution, the statement

Do[ If[ Not[NumberQ[data[[i]] ] ],

Print[" data is not numerical at x = ",xvals[[i]]," y = ",iy," z = ",iz,"; it is ",data[[i]]," !"]],

{i,1,Length[data]}];

will let the user know if a data point is not numerical. This will help troubleshoot problems and warn you of errors.

Pre-Questions:

0) You might have to refer to the tutorials to remember Do loops. Explain what the following Do loop does. If you aren't sure of your answer, try it.

Do[

Print[ (i + 1)^2 ],

{i, 0, 10}]

  1. From the excerpt given above, explain what the statements data = Table[ 0, {ix,ixmin,ixmax,ixstep}]; and xvals = Table[ ix, {ix,ixmin,ixmax,ixstep}]; do. What about the statement data[[i]] = N[psi2[ xvals[[i]],iy,iz,nx,ny,nz, norm] ]; ?
  2. Considering your answer to the previous question, if you were to execute a cell with the statement data[[31]], what would it return? Explain in detail what this statement is asking Mathematica to do.
  3. There are four Do loops in this excerpt. What does each Do loop do? Write a couple of sentences explaining what is going on inside these Do loops, basing your argument on the excerpt in the section above.
  4. If you set numberofsteps to 101, how many total points will be in the complete computation?
  5. If a represents the width of an infinite square well and it equals 100, and you set numberofsteps = 101, what will the interval in x between computations of psi2 (at fixed y and z) be?
  6. How would you change the above example if you wanted the interval in z between computations of psi2 to be half as large while keeping the x and y intervals the same?

THE REAL WORK

This document is on the web, and you are welcome to copy the Mathematica statements and paste them into your own notebook. You will, however, need to make some slight modifications. This should save time and angst.

  1. Open the file called later.ma. In the last step of that lab, I had you use the Scatterplot module to make a 3-D plot of the probability of finding a particle in the box for several states. Delete any plots that remain to conserve memory. Delete the cells containing

<<Sctrplot.ma

and

stcplt = ScatterPlot[1,1,1,0,L,10,10000,1000];

Save your work!

  2. Be sure you still have a definition for psi2. Remember, it should look like this excerpt:

psi2[x_,y_,z_,nx_,ny_,nz_, norm_] := 1/norm ...

At the end of later.ma, add a cell that will change the directory to your directory and read in the binfiles.ma notebook. Here, I'm assuming that the disk and directory containing binfiles.ma is c:/yourdirectory. This is also where your data file will be saved. Please edit c:/yourdirectory so that it points to the disk you want (good choices are your student directory or a floppy, a:/). Be sure that the disk you choose has at least 1 MB of free space. Note: you might want to change the filename to match the quantum numbers you choose. Try not to overwrite any files you want to keep!

SetDirectory["c:/yourdirectory"];

<<binfiles.ma

filename = "well1.dat";

  3. Add another cell that defines the number of data points that will be taken. If you set numberofsteps to 51, the total number of points in the computation will be 140608. This is a small-to-modest-sized file for this type of data analysis. Copy all of these statements into the cell. Note that globalmax is set to 0.01, which means we are expecting that the maximum value psi2 will ever take is 0.01. This might not be a good value to choose; please check your psi2 to see that it is a good choice (a quick way to check is sketched after the code below).

numberofsteps = 51;

globalmax = N[0.01];

globalmin = 0;

L = 10;

ixmin = 0;

ixmax = L;

ixstep = (ixmax-ixmin)/numberofsteps;

iymin = 0;

iymax = L;

iystep = (iymax-iymin)/numberofsteps;

izmin = 0;

izmax = L;

izstep = (izmax-izmin)/numberofsteps;
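
One quick way to check whether globalmax = 0.01 is a sensible ceiling is to evaluate psi2 on a coarse grid and look at the largest value it returns. This is only a sanity check (the true maximum could be slightly larger than what a coarse grid catches), and it assumes nx, ny, nz, and norm are already defined in your notebook:

(* Sample psi2 on an 11 x 11 x 11 grid and report the largest value found. *)
(* If this exceeds globalmax, increase globalmax before building the file. *)
roughmax = Max[Table[N[psi2[x, y, z, nx, ny, nz, norm]],
    {x, ixmin, ixmax, (ixmax - ixmin)/10},
    {y, iymin, iymax, (iymax - iymin)/10},
    {z, izmin, izmax, (izmax - izmin)/10}]];
Print["Rough maximum of psi2: ", roughmax]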

  4. Copy these statements into another cell. Hopefully, you understand each statement from answering the pre-questions.

chunknumber = 1;

totalnumberofpoints = (numberofsteps+1)^3;

Print[ "Estimate of Binary Data File Size: ",totalnumberofpoints];

begintime = AbsoluteTime[];

computation = 0;

Do[

Do[ data = Table[ 0, {ix,ixmin,ixmax,ixstep}];

xlen = Length[data];

xvals = Table[ ix, {ix,ixmin,ixmax,ixstep}];

i = 1;

Do[ data[[i]] = N[psi2[ xvals[[i]],iy,iz,nx,ny,nz, norm] ];

computation = computation + 1,

{i,1,xlen}];

data=Flatten[data];

(*

Print["computation # ",computation, " Length of data: ",Length[data], " xvals,iy,iz: ", xvals," ",iy," ",iz," ",

data];

*)

If[ Length[data] != numberofsteps+1, Print["WARNING!!!!!!!!!!! Length[data] is ",

Length[data]," and numberofsteps+1 is ",numberofsteps+1]];


Do[ If[ Not[NumberQ[data[[i]] ] ],

Print[" data is not numerical at x = ",xvals[[i]]," y = ",iy," z = ",iz,"; it is ",data[[i]]," !"]],

{i,1,Length[data]}];

data = IncrementalScaleTool[1,255,globalmin,globalmax,data];

chunknumber = IncrementallyBuildBinaryFile[filename, data, chunknumber],

(*

Print["After save: computation # ", computation, " Length of data ",Length[data],

" xvals,iy,iz: ", xvals," ",iy," ",iz," ",BaseForm[data,16]],

*)

{iy,iymin,iymax,iystep}];

Print[ N[ Length[data]^2 * ((iz-izmin)/izstep+1) /totalnumberofpoints * 100],

" % Done, Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ],

{iz,izmin,izmax,izstep}]




Print[ "Actual File Length (bytes): ", (chunknumber - 1) * (Length[data]),

" Total Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ];

Save your work!

CREATING DATA FILES

  5. We will begin by creating a file with the quantum numbers nx = 1, ny = 1, nz = 1. Execute the cells and, with luck, a data file will be created on the disk and in the directory you specified. Check that the file created has the same size as the computation indicates. This is most easily done from the DOS prompt (available from the Start menu); a way to check from within Mathematica is sketched after this list.
  6. Divide the various states among the others in the class. Be sure to change the filename to something appropriate. You should try one higher state, although I wouldn't try states higher than (4,4,4); it becomes a little confusing beyond that. When you are done editing, save your work (again). Highlight all the new cells and execute them. If they don't complete by the end of class, I'll take them to my server, let them run for a day, and we'll look at the results later.
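
If you would rather check the file size without leaving Mathematica, FileByteCount reports the size of a file in bytes; since each point is stored as a single byte, the result should equal (numberofsteps + 1)^3 and match the actual file length printed at the end of the computation:

(* Size, in bytes, of the data file in the current directory;    *)
(* for numberofsteps = 51 this should come out to 52^3 = 140608. *)
FileByteCount[filename]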

4-D VISUALIZATION USING SLICER

  7. I'll show you how to run Slicer. Once you have an instance running (and files copied to the right place, if necessary), open your data file that represents psi2 (the probability of finding the particle) for (nx, ny, nz) = (1, 1, 1). Change the file type to all files (*.*), navigate around the directory structure to find your file, and hit 'OK'. A new window opens asking for some more information. The data type should be unsigned byte, and the number of bytes to skip (skip bytes) is zero. The layers (z), columns (y), and rows (x) need to equal numberofsteps+1 from the notebook you used to make the file. Set each of these parameters to that value.

Next, hit the survey button. Slicer will compute the maximum and minimum values in the entire file. If there is something wrong with the file, hitting this button will probably find it. If the survey process is successful and the min and max values are about 0 and 240, you are in great shape. Press the 'OK' button.

  8. Cool, huh? You see the entire volume you computed - the 3-D well is a box! Right now, however, you can't really see much of the data. Maximize the windows. To update (redraw) the window, click on the camera icon.
  9. Let's create some slices. From the menu bar, go view, slices. Create a slice in the y-z plane by clicking on X PLANE, which creates a planar slice along the x-axis at the value given by current slice: offset. Let's create one in the center at offset = 0 (the default). Click OK. Deselect the volume visualization icon, the cube, and select the slice visualization icon, the three planes. Click the redraw (camera) icon to see the results.
  10. Create two more planes in the Y PLANE and Z PLANE with zero offsets (in the center of the volume). Look them over.
  11. Answer these questions:
    1. Note that when using the default colour table, as we are now, larger probabilities are represented by red and orange, mid-range values by yellow and green, and smaller probabilities by blue and violet.
    2. Where is it most likely to find the particle?
    3. Is there more than one place where it is highly likely to find the particle in a measurement?
    4. What kind of symmetry does the nx = ny = nz = 1 probability distribution have?
    5. OPTIONAL - You can change the colour scheme in colour tables, but if you do, please return it to rainbow before proceeding. You'll need to redraw the screen before the change takes effect.
  12. You may rotate the image by placing the mouse cursor over the image, holding down the left mouse button, and moving about. It's a bit tricky to get exactly the view you want. When you have a new view, release the mouse button and redraw the volume. You can also manually enter the viewing position by going to the menu bar's options, set view position. When you are done playing with different views, return to the default view by setting the angles to x = 45, y = 22.5, z = 90 degrees (translations to zero and scale to 0.5).
  13. Now let's experiment with isosurfaces. Go to the menu bar's view, isosurfaces. A histogram appears on the left side showing the number of values in the data set versus the values. Since most of the volume is made up of low probabilities (violet), you see that low values dominate the histogram; in fact, you can barely see the important higher probabilities. Use the sliding bar (far left) to emphasize the higher (redder) probabilities. Create an isosurface by hitting the new surface button. The new surface flashes. The current value is shown at the upper right. Change this value to 200, click on okay, find the slices icon (the 3 parallel planes) and click on it to turn it off, then turn on the view isosurfaces icon (the sphere), and redraw the volume. What appears is the surface on which the probability is 200 (in the range 1 to 255, remember).
  14. Add a new surface following the previous step, but make its value 180. When you redraw the volume you see the 180 isosurface.
  15. What happened to the 200 isosurface?!#*&^@!!!! It's hidden by the opaque 180 isosurface. To see both, we need to see through the 180 isosurface. This can be accomplished with the opacity setting on the isosurfaces menu. Return to this menu and change the opacity of the 180 isosurface to 0.5 (highlight it in the little surfaces defined list at the center right of the menu before monkeying with it). Don't forget that the changes won't be seen until you redraw the screen.
  16. Analysing the data is easier if you look at a number of isosurfaces. Return to the isosurfaces menu and delete all isosurfaces. Click on the auto... button to define a number of isosurfaces. Create 6 surfaces between 50 and 200. Then change their opacities as described in the table below:

Value      Opacity
 50.00     0.05
 80.00     0.08
110.00     0.10
140.00     0.20
170.00     0.30
200.00     1.00

Of course, you can choose your own opacity values, but I find this choice to be fairly good.

  17. Now can you see the symmetry? Describe in your report the volume, the isosurfaces' values and spacing (in data values and in physical distance in x, y, z), and what this visualization tells you about a particle in this state.
  18. Now is a good time to mention the shadow that the volume casts on the 'wall' of the 'room' containing the data volume. The room and data volume are lit by a single point light source. We can change the location of this light source. Go to the menu bar's options, lighting options. A circle appears. It represents a hemisphere, which you can think of as a dome resting on the surface of the monitor. You can move the light (the black square) around to change the direction of illumination of the volume. The dashed circle represents where the light would be if it were 45 degrees from the top of the hemisphere. Try placing the light source about one black square's width away from the dashed circle (on the inside) and on the horizontal line on the right side, then redraw the screen. What visualization technique is this called?
  19. It's easy enough to combine techniques. Try turning on the slices and isosurfaces icons at the same time.
  20. The last technique, alpha-channel dicing, is pretty tricky; however, it often offers the best results if you have the time for the necessary trial and error. To save time, I've created a colour table that works all right for this data. I didn't spend much time on it, so it's just not optimal. Find the colour table file well_111.cmp and copy it to your favourite directory or floppy (use the file manager or Windows Explorer). In Slicer, from the menu bar, go view, edit colour table and open (click on the floppy icon) the file well_111.cmp. Try to deduce what the little graphic plot is telling you (refer back to the background section).
  21. Click okay from the edit colour table menu, click on the slices and isosurfaces icons to turn them off, click on the volume icon (the cube), and redraw the screen. As I mentioned, this isn't an optimal colour table. If you are feeling adventurous, edit the colour table yourself.

ANALYSING YOUR OWN DATA

  22. Open the data (.dat) file you created in step 6, using the above section as a guide. Then analyse it, again using the above section to assist you. Write a summary of your analysis for your report and to share with the class.
  23. Finally, as a class, we will discuss the analyses and arrive at an understanding of the quantum numbers and how they affect the probability of locating a particle in a measurement. In particular, you should readdress the questions posed in step 11. Take notes! Summarize the class discussion in your report.