Phil 4.17.17

7:00 – 8:00, 3:00 – 4:00 Research

  • This looks good: Bayesian data analysis for newcomers
  • Also this: Seeing Theory
  • I want to do a map that is based on population vs geography:
    • Gridded Population of the World (use Matplotlib to generate an image?). Can’t get the data directly; need to see if it’s available via UMBC (or maybe GLOBE?)
    • Wilbur terrain generation (installed. Will accept an image as the heightmap source)
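Since Wilbur will accept an image as the heightmap source, the population-to-terrain step could be sketched roughly like this (the array here is a random stand-in; the real values would come from the downloaded GPW raster):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Stand-in for the GPW raster: a small 2D array of population density.
# The real data would be loaded from the gridded-population download.
pop = np.random.default_rng(0).random((64, 64)) ** 3

# Save as a grayscale image for Wilbur to read as a heightmap:
# brighter pixels = denser population = higher terrain.
plt.imsave("heightmap.png", pop, cmap="gray")
```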
  • Tried using Qt Designer, but it can’t find the web plugin?
  • Installing Python 3.6 on my home dev box
  • Downloaded all the Python code and my simulation data. I want to be able to merge tables to produce networks that can then be plotted, so I think this morning will mostly be installing things
  • NOTE: When installing Python, the only way to install for all users is to go through the advanced setup.
  • Installing packages. CMD needs to run as admin, which blows.
  • After some brief issues with the IDE not being set up in the project structure, got all the pieces that use NumPy, pandas, and Matplotlib running. That should be enough for table parsing (though I’ll still need the Excel read/write packages), but I still need to get started with graph-tool
  • Paper was rejected – time to try ACM? LaTeX format. Downloaded and compiled! Now I just have to move the text over? Wrap the existing text? That’s something for tomorrow.

8:30 – 2:30, BRC

  • Working on table joins. That was pretty straightforward. Note that for a column collision you have to provide a suffix. Makes me think I want to compare across DataFrames instead:
    eu.read_dataframe_excel(args.excelfile, None)
    cluster_df = eu.read_dataframe_sheet("Cluster ID")
    #  print(cluster_df)
    dist_df = eu.read_dataframe_sheet("Distance from mean center")
    #  print(dist_df)
    merged_df = cluster_df.join(other=dist_df, lsuffix='_c', rsuffix='_d')
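The suffix behavior above can be seen in a self-contained form. This is a minimal sketch with made-up stand-in data (not the actual cluster/distance sheets): two DataFrames that share a column name, which is exactly the collision that forces `lsuffix`/`rsuffix`:

```python
import pandas as pd

# Two sheets with the same index (row IDs) and an overlapping
# column name, "value" -- a stand-in for the real sheet columns.
cluster_df = pd.DataFrame({"value": [0, 1, 1]}, index=["a1", "a2", "a3"])
dist_df = pd.DataFrame({"value": [0.5, 1.2, 0.8]}, index=["a1", "a2", "a3"])

# Without suffixes, join() raises a ValueError on the collision;
# with them, both columns survive with distinguishable names.
merged_df = cluster_df.join(other=dist_df, lsuffix="_c", rsuffix="_d")
print(list(merged_df.columns))  # ['value_c', 'value_d']
```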
  • So now that I can read in and analyze sheets, what am I trying to do? I think that for each time slice, and by cluster, I want to produce a sorted list from most to least common membership.
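One way that per-slice, per-cluster tally could look, assuming the merged data can be melted into a long table with one row per agent per time slice (the column names here are hypothetical):

```python
import pandas as pd

# Hypothetical long-format table: one row per agent per time slice,
# recording which cluster the agent belonged to in that slice.
df = pd.DataFrame({
    "time":    [0, 0, 0, 0, 1, 1, 1, 1],
    "cluster": [0, 0, 1, 0, 1, 1, 0, 1],
})

# For each time slice, count cluster membership; value_counts()
# sorts most-to-least common within each group.
counts = (df.groupby("time")["cluster"]
            .value_counts()
            .rename("members"))
print(counts)
```

The result is a Series indexed by (time, cluster), already ordered from largest to smallest membership within each time slice.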