Monthly Archives: October 2013

Dong Shin 10.07.2013

  • discussed next gen FA applications with Phil – write up to come…
  • continue working on integrating RA with FA
    • fixed all dynamic class loader references to point at the new package names (see the sketch below)
    • updated the change log files for RA and FA
    • testing done!
  • prepping for deployment as people are getting back to work.
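For context on why the package rename touched the dynamic loading code: anything that resolves a class from a string name compiles cleanly but fails at runtime once the package moves, so every stored dotted name has to be updated by hand. The same pitfall applies in any language that loads classes by name; a minimal Python illustration (the dotted names are invented for the example):

    import importlib

    def load_class(dotted_path):
        # Resolve a class from a fully qualified dotted name at runtime.
        module_name, class_name = dotted_path.rsplit(".", 1)
        return getattr(importlib.import_module(module_name), class_name)

    # A stale reference like this still compiles after a package rename,
    # but raises ImportError at runtime:
    #   load_class("requisitionassistant.views.InvoiceViewer")
    # and has to be updated by hand to the merged package:
    #   load_class("financialassistant4.views.InvoiceViewer")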

Phil 10.4.13

8:00 – 4:00 FP

  • Basically banging away on the paper. Abstract, Introduction, Previous Work, Experiment are done. About halfway through results. Might finish first draft on Monday?

Dong Shin 10.04.2013

  • since we are moving away from maven, it would be good to combine FinancialAssistant and RequisitionAssistant into a single project
  • working on combining FA and RA
    • committed FA and RA
    • created new project based on FA – FinancialAssistant4
      • moving RA stuff to FA4
    • moved database-specific stuff from RA to MySQLIf4
    • compiled and ran, checked in as FinancialAssistant4

Phil 10.3.13

8:00 – 9:00 SR

  • Checking to see if Lenny can come over and help, since he’s stuck at home.
  • Doing some research into the new Flash applications

9:00 – 4:00 FP

Dong Shin 10.03.2013

  • working on RA
    • creating Invoice Viewer for Requisitions – done; need to discuss what additional functionality is needed.
    • Contract Number in Invoice Entry is a TextInput now
    • added support for typing commas and decimal points in numeric entry fields
    • working on defaults table for AddableListCombo
      • addable_combobox_defaults table
    • found a way to write log4j server logs to a MySQL database (a config sketch follows this list)
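The log doesn’t say which mechanism this is; one common route in log4j 1.x is the bundled JDBCAppender, which turns each log event into an INSERT. A minimal config sketch, assuming a local MySQL instance and a hypothetical server_logs table (all names and credentials below are placeholders):

    # log4j.properties – JDBCAppender sketch; table/column names are assumptions
    log4j.rootLogger=INFO, DB
    log4j.appender.DB=org.apache.log4j.jdbc.JDBCAppender
    log4j.appender.DB.driver=com.mysql.jdbc.Driver
    log4j.appender.DB.URL=jdbc:mysql://localhost/fa_logs
    log4j.appender.DB.user=loguser
    log4j.appender.DB.password=********
    log4j.appender.DB.sql=INSERT INTO server_logs (log_date, logger, priority, message) VALUES ('%d{yyyy-MM-dd HH:mm:ss}', '%c', '%p', '%m')

JDBCAppender is simplistic about escaping quotes in messages, so this is a starting point rather than a production setup.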

Dong Shin 10.02.2013

  • keeping track of what to deploy at https://viztales.wordpress.com/to-deploy-whenever-they-are-ready/
  • working on script to update Obligations/Outlays for EAs
    • created and ingested test data – Contracts Example.xlsx
    • set up projects for EAs
    • query to get contracts data mapped to EAs
      • SELECT *
        FROM budget_centers bc
        -- attach each budget center’s contracts, then the matching Cognos rows
        LEFT JOIN budget_center_contracts AS bcc ON bc.uid = bcc.budget_center_id
        LEFT JOIN contracts_cognos AS cc ON cc.sub_budget_center = bcc.sub_budget_center
        -- SUBSTRING(..., 5) drops the first four characters of the Cognos
        -- requisition ID so the two requisition IDs line up
        AND SUBSTRING(cc.requisition_id, 5) = bcc.requisition_id
        WHERE bc.req_type = 'EA'
    • updated obligation_outlays_queries.sql
    • updating update_obligations_outlays.py to run the update automatically (a sketch follows this list)
    • fixed get_cognos_outlays query to use expensed_date, not committed_date
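A minimal sketch of the kind of update pass update_obligations_outlays.py might make, using the MySQLdb driver; the join logic comes from the query above, but the connection details and the obligation/outlay column names are assumptions:

    import MySQLdb

    # Roll Cognos contract amounts up to each EA budget center.
    # Host, credentials, and column names are placeholders.
    conn = MySQLdb.connect(host="localhost", user="fa", passwd="********", db="fa")
    cur = conn.cursor()
    cur.execute("""
        SELECT bcc.budget_center_id, SUM(cc.obligated), SUM(cc.expensed)
        FROM budget_center_contracts AS bcc
        JOIN contracts_cognos AS cc
          ON cc.sub_budget_center = bcc.sub_budget_center
         AND SUBSTRING(cc.requisition_id, 5) = bcc.requisition_id
        GROUP BY bcc.budget_center_id
    """)
    for budget_center_id, obligations, outlays in cur.fetchall():
        cur.execute(
            "UPDATE budget_centers SET obligations = %s, outlays = %s WHERE uid = %s",
            (obligations, outlays, budget_center_id),
        )
    conn.commit()
    conn.close()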

Phil 10.2.13

8:30 – 10:30 SR

  • Working out a plan with Dong about what to do with our time. I’m guessing that the shutdown will last until at least Oct 17. The following is for the rest of this week:
    • Fix script to add EA Cognos data plus autofill (Obligations_outlays)
    • enter Contract Number in Invoice Entry (RA)
    • allow commas, periods in data entry (RA)
    • add Invoice Viewer in RA for the selected Req
    • Get all code up and running on FGMDEV
    • Drop maven and go to eclipse-based projects
    • Default combobox capability.
    • Create a “table_errors” table that holds the application, user, date, time, query, and error message, and take out the “Mail to Admin” note (see the sketch after this list)
  • For next week, add #include for python module storage and assembly.
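A possible shape for the proposed errors table, going only by the fields listed above; types and lengths are guesses:

    CREATE TABLE table_errors (
        uid INT AUTO_INCREMENT PRIMARY KEY,
        application VARCHAR(64),
        user VARCHAR(64),
        error_date DATE,
        error_time TIME,
        query TEXT,
        error_message TEXT
    );

Splitting date and time into two columns mirrors the list above; a single DATETIME column would work just as well.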

10:30 – 4:30 FP

  • Finished Annotated Bibliography. Experimental design is next.

Phil 10.1.13

8:00 – 4:00 FP

  • I’m guessing this is what I charge to for a while
  • In meeting with Dr. Kuber, I brought up something that I’ve been thinking about since the weekend. The interface works, provably so. The pilot study shows that it can be used for (a) training and (b) “useful” work. If the goal is to produce “blue collar telecommuting”, then the question becomes, how do we actually achieve that? A dumb master-slave system makes very little sense for a few reasons:
    • Time lag. It may not be possible to always get a fast enough response loop to make haptics work well
    • Machine intelligence. With robots coming online like Baxter, there is certainly some level of autonomy that the on-site robot can perform. So, what’s a good human-robot synergy?
  • I’m thinking that a hybrid virtual/physical interface might be interesting.
    • The robotic workcell is constantly scanned and digitized by cameras. The data is then turned into models of the items that the robot is to work with.
    • These items are rendered locally to the operator, who manipulates the virtual objects using tight-loop haptics, 3D graphics, etc. Since (often?) the space is well known, the objects can be rendered from a library of CAD-correct parts.
    • The operator manipulates the virtual objects. The robot follows the “path” laid down by the operator. The position and behavior of the actual robot is represented in some way (ghost image, warning bar, etc.). This is known as Mediated Teleoperation, and is described nicely in this paper.
    • The novel part, at least as far as I can determine at this point, is using mediated telepresence to train a robot in a task:
      • The operator can instruct the robot to learn some or all of a particular procedure. This probably entails setting entry, exit, and error conditions for tasks, which the operator is able to create on the local workstation.
      • It is reasonable to expect that in many cases, this sort of work will be a mix of manual control and automated behavior. For example, placing a part may be manual, but screwing a bolt into place to a particular torque could be entirely automatic. If a robot’s behavior is made fully autonomous, the operator needs simply to monitor the system for errors or non-optimal behavior. At that point, the operator could engage another robot and repeat the above process.
      • User interfaces that inform the operator when the robot is coming out of autonomous modes in a seamless way need to be explored.