Category Archives: Server

Phil 8.13.13

8:00 – 10:00 SR

  • Backups
  • There may be a problem with some data disappearing after being entered in the VizTool. Overheard a conversation, but didn’t follow up.

10:00 – 4:30 FP

  • Got the wiring cleaned up.
  • Integrating collision response with the targetSphere. The math is looking reasonably good.
  • Added multi-target capability.
  • Added adjustable sensitivity to the pressure sensors. Pushing directly on the speakers is causing artifacts; I think I need to build a small C-section angle bracket that decouples the squeezing force from the vibroacoustic feedback.

Phil 8.9.13

8:00 – 10:00 SR

  • Backups
  • Desk issues
  • Looks like someone might be interested in Visibility, pretty much out of the blue. We’ll see.

10:00 – 5:00 FP

  • Wired up the speakers to the old grip interface. Amp and speakers work nicely.
  • Now to work on the force moving the gripper.
  • I also need to get the target force vector back so that I know how to move the target. Not looking forward to implementing friction…
  • Hmm. Before I commit to doing it right, I’m wondering if I can look at the summed magnitude of the force vectors vs. the magnitude of the summed force vectors. If the first is high and the second is low, then it means that the ball is being squeezed. Above a certain ratio, the target could simply be at the center of the sensor points.
  • Assuming that the above will work, I need to add the following to targets:
    • An awareness of a “floor” that provides either a limit to motion or an upward force once penetrated
    • An awareness of the forces being applied and the sources of the forces. Since these are spheres with given radius, the information to be tracked could be as simple as the center positions of the items
    • Gravity and mass?
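The ratio test above can be sketched directly; the `Vec3` helper and the `squeezeRatio` name are mine for illustration, not project code:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Minimal 3D vector for the sketch (hypothetical helper, not from the project).
struct Vec3 {
    double x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    double mag() const { return std::sqrt(x * x + y * y + z * z); }
};

// Ratio of (sum of the magnitudes) to (magnitude of the sum). Near 1 means
// the contact forces mostly point the same way (the ball is being pushed);
// a large ratio means they largely cancel (the ball is being squeezed).
double squeezeRatio(const std::vector<Vec3>& forces) {
    double sumOfMags = 0.0;
    Vec3 sum{0.0, 0.0, 0.0};
    for (const Vec3& f : forces) {
        sumOfMags += f.mag();
        sum = sum + f;
    }
    const double eps = 1e-9;  // avoid divide-by-zero when forces balance exactly
    return sumOfMags / (sum.mag() + eps);
}
```

Above whatever ratio threshold testing suggests, the grip case would fire and the target could snap to the centroid of the sensor points.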

Phil 8.8.13

8:00 – 10:00 SR

  • Backups
  • Deployed new FA, which seems to be working well
  • Met Pat.

10:00 – 5:00 FP

  • Building a two-fingered gripper
  • Going to add a sound class to SimpleSphere so that we know what sounds are coming from which collision. Didn’t do that, but I’m associating the sounds by index, which is good enough for now.
  • Need to calculate individual forces for each sphere in the Phantom and return them. Done. To keep the oscillations at a minimum, I’m passing the offsets from the origin. That way the loop uses the device position as the basis for calculations within the haptic loop.
  • Here’s the result of today’s work: 
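The offsets-from-origin idea can be sketched like this; the struct and function names are illustrative stand-ins, not the actual project classes:

```cpp
#include <cassert>
#include <cmath>

// Plain stand-ins (hypothetical names, not the project's classes).
struct Vec3 { double x, y, z; };

Vec3 add(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

// Each gripper sphere is stored as an offset from the device origin. The
// haptic loop combines the freshly-read device position with that offset,
// so sphere positions are always based on current device data rather than
// a stale world position computed in the slower graphics loop -- which is
// what keeps the oscillations down.
struct GripperSphere {
    Vec3 offset;    // fixed offset from the device origin
    double radius;
};

Vec3 sphereWorldPos(const Vec3& devicePos, const GripperSphere& s) {
    return add(devicePos, s.offset);
}
```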

Phil 8.7.13

8:00 – 11:30 SR

  • Backups
  • Training

11:30 – 4:00 FP

  • Basically spent the whole day figuring out how the 4×4 Phantom matrix equates to the rendering matrix (I would have said OpenGL, but that’s not true anymore; I am using the lovely math libraries from the OpenGL SuperBible 5th Edition, which make it look kind of like the OGL of yore). Initially I thought I’d just use the vector components of the rotation 3×3 from the Phantom to get the orientation of the tip, but for some reason parts of the matrix appear inverted. So instead of using them directly, I multiply the modelview matrix by the Phantom matrix. Amazingly, this works perfectly. To make sure, I rendered a sphere on the +X, +Y, and +Z axes in the local coordinate frame. Everything tracks. So now I can create my gripper class and get the positions of the end effectors from the class. And since the position is in the global coordinate frame, it kind of comes along for free.
  • Here’s a picture of everything working:
  • PhantomAxis
  • Tomorrow, I’ll build the gripper class and start feeding that to the Phantom. The issue will be summing the force vectors from all the end effectors in a reasonable way.
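The matrix trick boils down to one column-major 4×4 multiply, modelview × phantom. The SuperBible math library provides an equivalent routine; this standalone version just makes the composition explicit:

```cpp
#include <cassert>

// Column-major 4x4 multiply, matching OpenGL conventions: out = a * b.
// (Sketch only; the OpenGL SuperBible math library has an equivalent,
// and the real code works in floats rather than doubles.)
void mat4Mul(const double a[16], const double b[16], double out[16]) {
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row) {
            double s = 0.0;
            for (int k = 0; k < 4; ++k)
                s += a[k * 4 + row] * b[col * 4 + k];
            out[col * 4 + row] = s;
        }
}
// Usage: mat4Mul(modelview, phantomMatrix, composed) puts geometry rendered
// with `composed` into the device's local frame, which is why the axis
// spheres track the tip.
```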

Phil 8.6.13

8:00 – 11:00 SR

  • Backups
  • Group meeting
  • Spent some time chasing down the duplication bug. The decision is to truncate the contracts_cognos table before ingest. There is still a problem where multiple rows are not being summed even though they are claimed (where contract numbers are NULL).

11:00 – 4:00 FP

  • Integrating all the pieces into one test platform. The test could be to move a collection of physically-based spheres (easy collision detect) from one area to another. Time would be recorded from the indication of a start and stop (spacebar, something in the sim, etc). Variations would be:
    • Open loop: Measure position and pressure, but no feedback
    • Force Feedback (Phantom) only
    • Vibrotactile feedback only
    • Both feedbacks
  • Probably only use two actuators for the simplicity of the test rig. It would mean that I could use the laptop’s headphone output. Need to test this by wiring up the actuators to a micro stereo plug. Radio Shack tonight.
  • Got two-way communication running between Phantom and sim.
  • Have force magnitude adjusting a volume.
  • Added a SimpleSphere class for most of the testing.
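The force-magnitude-to-volume hookup is just a clamped linear map; something along these lines, where the 3 N full-scale value is a guess rather than a measured number:

```cpp
#include <algorithm>
#include <cassert>

// Map a force magnitude (newtons) onto a playback volume in [0, 1].
// maxForce is a tuning assumption, not a value from the actual rig.
double forceToVolume(double forceMag, double maxForce = 3.0) {
    return std::clamp(forceMag / maxForce, 0.0, 1.0);
}
```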

Phil 8.5.13

8:00 – 10:00

  • Backups
  • Status Report
  • Helped Dong with column spacing for query returns
  • Burned a disk with the new deployable and status report.

FP 10:00 – 4:00

  • This could be interesting: Indoor Location Estimation Using Visible Light Communication and Image Sensors
  • Worked on the shared memory system. Data is now passed robustly between the Phantom app and the sim.

Phil 8.2.13

8:00 – 10:00 SR

  • Backups
  • Sent a note to the system security folks asking to clarify what I was responsible for installing on my virtual server.
  • Need to do a status report for Tangie
  • Got tooltips for RA/FA
  • Meeting with Chris and Lenny. Mostly we discussed the issue of bad data in Cognos.

10:00 – 4:00

  • Integrating Phantom
  • Code is in and compiling, but there are mysterious errors:
  • HD_errors
  • HD_errors2
  • I think I need a more robust startup. Looking at more examples….
  • Hmm. After looking at other examples, the HD_TIMER_ERROR  problem appears to crop up for anything more than trivially complex. Since both programs seem to run just fine by themselves, I’m going to make two separate executables that communicate using Named Shared Memory. Uglier than I wanted, but not terrible.
  • Created a new project, KF_Phantom to hold the Phantom code
  • Stripped out all the Phantom (OpenHaptics) references from the KF_Virtual_Hand_3 project.
  • Added shared memory to KF_Phantom and tested it by creating a publisher and subscriber class within the program. It all works inside the program. Next will be to add the class to the KF_VirtualHand project (same code, I’m guessing? Not sure if MSVC is smart enough to share). Then we can see if it works there. If it does, then it’ll be time to start getting the full interaction running. And since the data transfer is essentially memcpy, I can pass communication objects around.
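That memcpy point is the key property of the scheme: as long as the communication object is trivially copyable, publishing is a single copy into the mapped view. A sketch, with a plain buffer standing in for the pointer that MapViewOfFile would return on Windows (all names here are mine, not the project's):

```cpp
#include <cassert>
#include <cstring>
#include <type_traits>

// A message passed between the Phantom app and the sim. Because it is
// trivially copyable (plain data, no pointers), a raw memcpy through the
// shared-memory view is the entire "serialization" step.
struct PhantomMsg {
    double position[3];
    double force[3];
    unsigned sequence;  // lets the subscriber detect a fresh write
};
static_assert(std::is_trivially_copyable<PhantomMsg>::value,
              "must be safe to memcpy through shared memory");

// sharedView stands in for the pointer MapViewOfFile returns on Windows.
void publish(void* sharedView, const PhantomMsg& msg) {
    std::memcpy(sharedView, &msg, sizeof msg);
}

PhantomMsg subscribe(const void* sharedView) {
    PhantomMsg msg;
    std::memcpy(&msg, sharedView, sizeof msg);
    return msg;
}
```

In the real two-process setup the buffer comes from CreateFileMapping with a shared name, but the publisher/subscriber logic is the same.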

Phil 8.1.13

8:00 – 10:00 SR

  • Backups
  • Need to write up monthly status reports for Tangie

10:00 – 4:30 FP

  • Continue integrating Phantom into testbed
  • Need to bring headphones back for microphone
  • Spent most of the day trying to figure out how to deal with geometry that has to be available to both the haptic and graphics subsystems. The haptics subsystem has to run fast (about 1000 Hz) and gets its own callback-based loop from the HD haptic libraries. The graphics run as fast as they can, but they get bogged down. So the idea for the day was to structure the code so that a stable geometry patch can be downloaded from the main system to the haptics subsystem. I’m thinking that the patches could be really simple, maybe just a plane and a concave/convex surface. I started by creating a BaseGeometryPatch class that takes care of all the basic setup and implements a sphere patch model. Other inheriting classes simply override the patchCalc() method and everything should work just fine. I also built a really simple test main loop that runs at various rates using Sleep(). The sphere is nice and stable regardless of the main loop update rate, though the transitions as the position is updated can be a little sudden. It may make sense to add some interpolation rather than just jumping to the next position. But it works. The next thing will be to make the sphere work as a convex shape by providing either a flag or using a negative length. Once that’s done (with a possible detour into interpolation), I’ll try adding it to the graphics code. In the meanwhile, here’s a video of a dancing Phantom for your viewing pleasure:

Phil 7.31.13

8:00 – 10:00 SR

  • Backups
  • Got the development environment including Mike’s old workspace set up on the integration machine
  • Was able to run the PKI test code and get good results. Next is to attach to the server and step through the filter to see what’s going on.

10:00 – FP

  • Finish getting the Phidgets code working in KF_Hand_3 – done
  • Start to add sound classes – done inasmuch as sounds are loaded and played using the library I wrote. More detail will come later.
  • Start to integrate Phantom. Got HelloHapticDevice2 up and running again, as well as quite a few demos

Phil 7.30.13

8:00 – 10:00 SR

  • Backups
  • Tried to do some training, but the servers were undergoing maintenance.

10:00 – FP

  • Brought in my fine collection of jumpers and connectors. Next time I won’t have to build a jumper cable…
  • Built the framework for the new hand test. The basic graphics are running
  • Added cube code to the FltkShaderSupport library
  • Next, I’m going to integrate the Phidget sensor code into the framework, then hook that up to sound code.
  • Had Dong register for Google’s Ingress, just to see what’s going on.
  • Loaded in the Phidgets example code; the library that works is the x86 library. Using the 64-bit library results in unresolved-externals errors.
  • There are a lot of straight-C examples. Just found the C++ class examples, simple.h and simple.cpp.

Phil 7.29.13

SR 8:30 – 10:30

  • Backups
  • Found the source of the doubling and tripling bug. New lines for the same values are being ingested. There needs to be a conflict-resolution screen that is part of the ingest process.
  • Need to escape reserved characters from the query builder.
  • Need to ensure that reserved words aren’t used in export to viztool.

FP 10:30 –

  • Prepping for demo
  • Fixing cable that broke on Friday
  • Got a lot of new parts in over the weekend including probes, cable, and sensors
  • Nice meeting with Dave Coleman. No concrete results (more meeting[s] required), but I think it went well.

Phil 7.26.13

SR 8:00 – 10:00

  • Found some query builder bugs
    • “GROUP BY” is getting stripped out
    • Having two requisition IDs is confusing
    • If you select new items after a modifier, incorrect SQL is generated
    • Need to be able to group similar rows (i.e. roll up all rows with the same req id)
  • There seems to be a math error on the “Budget Information” panel. Lenny found that on project 236, there are three EA lines (and nothing else). One line is correct, one line is doubled, and one line is tripled.
  • Backups
  • Installed Eclipse 4.2 and Mike’s old workspace on the server.

FP 10:00- 4:00

  • Today, the goal is to build a circuit with three channels that connects to the Phidgets voltage sensor. The only thing I’m wondering is if I’ll get the resolution with the voltage range I’m getting – Zero to about 2.5 volts. I’m estimating that I should get about 1500 – 3000 steps out of that, assuming -30v to +30v is resolved to a (unsigned?) 16-bit int.
  • Done!
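The estimate checks out as back-of-envelope arithmetic: 16 bits over a -30 to +30 V span is 65536 / 60 ≈ 1092 counts per volt, so a 0 to 2.5 V sensor swing gives roughly 2700 usable steps, inside the 1500 – 3000 range above. As a sanity check (the ADC span and bit depth are the same assumptions as in the text):

```cpp
#include <cassert>
#include <cmath>

// Usable ADC steps for a signal that spans signalSpanVolts, assuming the
// converter digitizes adcSpanVolts total at the given bit depth. These
// defaults mirror the -30..+30 V, 16-bit assumption above.
int usableSteps(double signalSpanVolts, double adcSpanVolts = 60.0, int bits = 16) {
    double countsPerVolt = std::pow(2.0, bits) / adcSpanVolts;
    return static_cast<int>(signalSpanVolts * countsPerVolt);
}
```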

ratsnest

Phil 7.25.13

8:00 – 10:00 SR

  • Backups
  • Deploying new swfs

10:00 – 5:00 FP

  • Got the sensor resistance converted to voltage. A 1k ohm resistor seems to work best, since I want most of the sensitivity to be light pressure. Next, build a circuit with three channels that connects to the Phidgets voltage sensor. The only thing I’m wondering is if I’ll get the resolution with the voltage range I’m getting – Zero to about 2.5 volts. I’m estimating that I should get about 1500 – 3000 steps out of that, assuming -30v to +30v is resolved to a (unsigned?) 16-bit int.

Phil 7.24.13

8:00 – 10:00 SR

  • Backups
  • Ran the query_logs to Viz query with the column names changed. It works like a charm
  • Added the new jars to VSS for better logging

10:00 – 4:00 FP

  • Organized the lab. Or at least a good start.
  • Hooked up one of the pressure sensors. The question is should I use a Wheatstone bridge or a pullup resistor? Sparkfun suggests a pullup resistor.
  • Picked up the power supply, crimping tool and wire
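The pull-up option Sparkfun suggests is just a voltage divider. A sketch of the math, assuming the FSR is the lower leg of the divider (its resistance falls as pressure rises, so the output falls from near Vcc toward zero):

```cpp
#include <cassert>
#include <cmath>

// Divider output with a fixed resistor on top and the FSR on the bottom:
// Vout = Vcc * Rfsr / (Rfsr + Rfixed). All values are illustrative.
double dividerOut(double vcc, double rFixed, double rFsr) {
    return vcc * rFsr / (rFsr + rFixed);
}
```

The choice of fixed resistor sets where the output swing is concentrated: a small value (around 1 kΩ) puts most of the swing in the light-pressure region, where the FSR is still in the kilohm range.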

Phil 7.22.13

8:00 – 2:00 SR

  • Uploaded new VSS jars to the server to see if I can track down the problem with large(?) files.
  • Backups
  • Added more logging, but it turns out that the problem was in the creation of the table with reserved column names (“index”, etc.).
  • Checked in scriptengine and JavaUtils. Also created an “Eclipse Server Projects” and checked the Vis3Testbed and VSSTestbed projects into it.

2:00 – 4:30 FP

Brian came over last night and we were able to load up his laptop with the drivers and software and run examples. At this point, we’re looking at three things to study with the rig:

  1. What is/are the best frequencies for spatial orientation (position and distance) using these actuators?
  2. What happens to speed and accuracy when there is more than one signal?
  3. Do 7 actuators work better than 4?

We’re in the process now of writing up the test plan. In the meantime, I’m going to try to adapt the Audio2X software to replace the synthesizers and use a Phidgets 1019 to replace the Arduino of the previous telepresence test rig. Once that’s done, then I’ll add in the Phantom. For reference, here’s a video of the first version running:

And here’s a picture of the latest interface that will be attached to the Phantom:

IMG_1444

To move this along, I’ve brought a small pile of electronics in from home. Tomorrow’s goal is to make sure that I can connect to the interface board. Once that’s working, I need to hook up the sensors from the old prototype (note! Bring in crimping pins for the female DB15 connector!).