
Phil 1.14.16

7:00 – 4:00 VTX

  • Good Meeting with Thom Lieb
    • Here’s a good checklist for reporting on different types of stories: http://www.sbcc.edu/journalism/manual/checklist/index.php
    • Ordered Melvin Mencher’s News Reporting and Writing
    • Discussed Chatbots, fashion in technology, and NewsTrust, a fact-checking site that piloted out of Baltimore in 2011. This post explains why it wound up folding. Important note: Tie into social media for inputs and outputs!!!
  • Added Communication Power and Counter-power in the Network Society to the corpus
  • Manuel Castells is the author of the above. Really clear thinking. Added another paper, The Future of Journalism: Networked Journalism
  • Had an interesting chat with an ex-cop about trustworthiness. He’s a fan of the Reid Technique and had a bunch of perspectives that I hadn’t considered. Looking for applications to text, I came across this, which looks potentially relevant: Eliciting Information and Detecting Lies in Intelligence Interviewing: An Overview Of Recent Research
  • Todd Schneider analyzes big data in interesting posts on his blog.
  • Chapter 7 Using Queries
    • JPQL
    • Totally digging the @NamedQuery annotation.
    • How to paginate a result:
      int pageSize = 15;
      int maxPages = 10;
      for(int curPage = 0; curPage < maxPages; ++curPage){
          List l = nt.runRawPagedQuery(GuidBase.class, curPage, pageSize, "SELECT gb.id, gb.name, gb.guid FROM guid_base gb");
          if(l == null || l.size() == 0){
              break;
          }else{
              System.out.println("Batch ["+curPage+"]");
              nt.printListContents(l);
          }
          System.out.println();
      }
    • Stopping at Queries and Uncommitted Changes, in case my computer is rebooted under me tonight.
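  • runRawPagedQuery() isn’t shown here, but presumably it turns the page number and size into a LIMIT/OFFSET clause on the raw SQL. A minimal sketch of that arithmetic (class and method names are mine, not the actual helper):

```java
public class PagedQueryDemo {
    // Hypothetical sketch: append a MySQL-style LIMIT/OFFSET to a raw query.
    // offset = page * pageSize, so page 0 starts at row 0, page 1 at row 15, etc.
    public static String pageQuery(String baseQuery, int page, int pageSize) {
        int offset = page * pageSize;
        return baseQuery + " LIMIT " + pageSize + " OFFSET " + offset;
    }

    public static void main(String[] args) {
        String q = "SELECT gb.id, gb.name, gb.guid FROM guid_base gb";
        System.out.println(pageQuery(q, 0, 15)); // ... LIMIT 15 OFFSET 0
        System.out.println(pageQuery(q, 2, 15)); // ... LIMIT 15 OFFSET 30
    }
}
```

The loop in the notes then just keeps asking for the next page until a query comes back empty.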

Phil 1.5.16

7:00 – 4:30 VTX

  • Working my way through / getting familiar with AtlasTi. I’ll have two papers in by this afternoon, so I should be able to try some quantitative taxonomy extraction.
  • Since I got the drillDownAlias() method running yesterday, I’m going to try setting up the various queries for the networks, dictionaries and users. That seems to be working nicely.
  • Added test queries for BaseUser, BaseDictionary and BaseNetwork. While doing this, I realized that I had not set up mapping from the dictionary to the entries and fixed that.
  • Need to see how we’re going to do CRUD actions on these structures.
  • Wrote the deduplicate methods for Aaron.
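  • The deduplicate methods themselves aren’t shown; one plausible shape, assuming GuidBase-style items count as duplicates when their guids match (class and method names are mine):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class DedupeDemo {
    // Keep the first item seen for each guid, preserving input order.
    public static List<String> deduplicateByGuid(List<String> guids) {
        Set<String> seen = new HashSet<>();
        List<String> result = new ArrayList<>();
        for (String g : guids) {
            if (seen.add(g)) {   // add() returns false if the guid was already present
                result.add(g);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(deduplicateByGuid(List.of("a", "b", "a", "c", "b")));
        // prints [a, b, c]
    }
}
```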

Phil 1.4.16

7:00 – 2:30 VTX

  • Got my copy of AtlasTi. Going to try using it to organize my papers/thoughts for the proposal. Imported a bunch of papers. Next, I’m going to re-do my annotations of the Gezi paper in Atlas and then see if I can start to cross-correlate, code, and so forth. After that, we’ll try some fancy things like getting eigenvectors out of taxonomies.
  • Realized that I should be able to automate Hibernate criteria so that a query like
    • Criteria criteria = drilldown(session, Showroom.customers, LIKE, 'Aaron') should be possible.
  • But before that, I’m going to try out spring JPA and Intellij spring / springboot integration.
  • Replicated the Hibernate sandbox (SpringHibernate1) using Spring. Not really sure what it gave me yet.
  • Adding in JPA support in the IDE
  • Still some missing jars. Since I can’t think of any other way to do it, grabbing the jars as needed from Maven.
  • Ok, I think I got everything in, but it blows up:
    [2016-01-04 11:18:13.409] - 3116 INFO [main] --- com.philfeldman.mains.SpringJPATest: Starting SpringJPATest on PFELDMAN-NCS with PID 3116 (C:\Development\Sandboxes\SpringHibernate1\out\production\SpringHibernate1 started by philip.feldman in C:\Development\Sandboxes\SpringHibernate1)
    [2016-01-04 11:18:13.428] - 3116 INFO [main] --- com.philfeldman.mains.SpringJPATest: No active profile set, falling back to default profiles: default
    [2016-01-04 11:18:13.476] - 3116 INFO [main] --- org.springframework.context.annotation.AnnotationConfigApplicationContext: Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@6321e813: startup date [Mon Jan 04 11:18:13 EST 2016]; root of context hierarchy
    [2016-01-04 11:18:14.504] - 3116 INFO [main] --- org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor: JSR-330 'javax.inject.Inject' annotation found and supported for autowiring
    [2016-01-04 11:18:14.577] - 3116 WARNING [main] --- org.springframework.context.annotation.AnnotationConfigApplicationContext: Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: private javax.sql.DataSource org.springframework.boot.autoconfigure.orm.jpa.JpaBaseConfiguration.dataSource; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [javax.sql.DataSource] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
    [2016-01-04 11:18:14.588] - 3116 SEVERE [main] --- org.springframework.boot.SpringApplication: Application startup failed
    org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: private javax.sql.DataSource org.springframework.boot.autoconfigure.orm.jpa.JpaBaseConfiguration.dataSource; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [javax.sql.DataSource] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
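  • The NoSuchBeanDefinitionException is saying that Boot’s JPA auto-configuration found no javax.sql.DataSource bean to inject. One common fix is to declare the datasource in application.properties so Boot can auto-configure one; the values below are placeholders I’m assuming from the MySQL setup used elsewhere in these notes, not the real credentials:

```properties
# Hypothetical application.properties; URL and credentials are placeholders.
spring.datasource.url=jdbc:mysql://localhost:3306/hibernate_test
spring.datasource.username=user
spring.datasource.password=secret
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
```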
  • Taking a break from the Spring JPA to add the ability to drill down to a class element with Hibernate. This really isn’t provided somewhere?
    /**
     * For some reason, hibernate can't create a nested alias. This loops over the path to create one.
     * @param rootClass - The root class that we are going to query
     * @param leafNodeName - the path to the node we want to restrict on (e.g. "Foo.bar.baz").
     * @return - A Criteria if successful, null if not.
     */
    public Criteria drillDownAlias(Class rootClass, String leafNodeName){
        String className = rootClass.getSimpleName();
        System.out.println("Class name = "+className);
    
        String[] nodeNames = leafNodeName.split("\\.");
    
        if(nodeNames.length < 1){
            return null;
        }
        Criteria criteria = session.createCriteria(rootClass, nodeNames[0]);
    
        // TODO: add some testing that verifies the path is valid
        for(int i = 1; i < nodeNames.length; ++i){
            String prevNode = nodeNames[i-1];
            String curNode = nodeNames[i];
            criteria.createAlias(prevNode+"."+curNode, curNode);
        }
    
        return criteria;
    }
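  • To sanity-check what the loop does, the alias chain it builds can be reproduced in plain Java with no Hibernate session: for a path like "showroom.cars.parts" it should call createAlias("showroom.cars", "cars") and then createAlias("cars.parts", "parts"). A sketch of just that string logic (class name is mine):

```java
import java.util.ArrayList;
import java.util.List;

public class AliasChainDemo {
    // Reproduce the (path, alias) pairs drillDownAlias would pass to createAlias().
    // Each entry reads "previousAlias.property -> newAlias".
    public static List<String> aliasPairs(String leafNodeName) {
        List<String> pairs = new ArrayList<>();
        String[] nodeNames = leafNodeName.split("\\.");
        for (int i = 1; i < nodeNames.length; ++i) {
            pairs.add(nodeNames[i - 1] + "." + nodeNames[i] + " -> " + nodeNames[i]);
        }
        return pairs;
    }

    public static void main(String[] args) {
        System.out.println(aliasPairs("showroom.cars.parts"));
        // prints [showroom.cars -> cars, cars.parts -> parts]
    }
}
```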

Phil 12.28.15

7:00 – 5:00 VTX

  • Oliver, J. Eric, and Thomas J. Wood. “Conspiracy theories and the paranoid style(s) of mass opinion.” American Journal of Political Science 58.4 (2014): 952-966, and the Google Scholar page of papers that cite it. Looking for insight into what makes (a type of?) person believe false information.
  • This follows up on an On the Media show called To Your Health, that had two interesting stories: An interview with John Bohannon, who published the intentionally bad study on chocolate, and an interview with Taryn Harper Wright, a blogger who chases down cases of Munchausen by Internet, and says that excessive drama is a strong indicator of this kind of hoax.
  • Reading Social Media and Trust during the Gezi Protests in Turkey.
    • Qualitative study that proposes Social Trust and System Trust
      • Social Trust
      • System Trust
  • Hibernating Moderately
    • Working on the dictionary
    • Working on the Corpus
      • Name
      • Date created
      • Source URL
      • Raw Content
      • Cleaned Content
      • Importer
      • Word count
      • guid
    • I think I’ll need a table that has the word id that points to a corpus and gives the count of that word in that corpus. The table gets updated whenever a dictionary is run against a corpus. Since words are not shared between dictionaries (Java != Java), getting the corpus to dictionary relationship is straightforward if needed.
    • Created a GuidBase that handles the name, id, and guid code that’s shared across most of the items.
    • Discovered Jsoup, which has some nice (fast?) html parsing.
    • Finished most of Corpus. Need to add a join to users. Done
    • Added BaseDictionary.
    • Added BaseDictionaryEntry.
    • Working on getting a join table working that maps words to corpora and am getting a “WARN: SQL Error: 1062, SQLState: 23000” (MySQL 1062 is a duplicate-key error). I was thinking that I could create a new HashMap, but I think I may have to point to the list in a different way. Here’s the example from Just Hibernate:
              session.beginTransaction();
              Showroom showroom = new Showroom();
              showroom.setLocation("East Croydon, Greater London");
              showroom.setManager("Barry Larry");
              Set cars = new HashSet();
              
              cars.add(new Car("Toyota", "Racing Green"));
              cars.add(new Car("Nissan", "White"));
              cars.add(new Car("BMW", "Black"));
              cars.add(new Car("Mercedes", "Silver"));
      
              showroom.setCars(cars);
              
              session.save(showroom);
              
              session.getTransaction().commit();
    • Where the Showroom class has the Cars Set annotation as follows:
       @OneToMany
       @JoinTable(name="SHOWROOM_CAR_SET_ANN_JOINTABLE",
                  joinColumns = @JoinColumn(name="SHOWROOM_ID"))
       @Cascade(CascadeType.ALL)
       private Set cars = null;
    • Anyway, more tomorrow…
    • Start on queries that:
      • List networks for users
      • List dictionaries for users
      • List Corpora
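  • The word/corpus join table described above (“the word id … points to a corpus and gives the count of that word in that corpus”) can be sketched without Hibernate: running a dictionary against a corpus produces one (word, count) row per dictionary word that appears. All names here are mine, not the actual schema:

```java
import java.util.HashMap;
import java.util.Map;

public class WordCountDemo {
    // Run a dictionary against a corpus: count occurrences of each dictionary
    // word in the cleaned content. Each non-zero entry would become one row in
    // the word/corpus join table, updated whenever the dictionary is re-run.
    public static Map<String, Integer> runDictionary(String[] dictWords, String corpusText) {
        Map<String, Integer> counts = new HashMap<>();
        String[] tokens = corpusText.toLowerCase().split("\\W+");
        for (String word : dictWords) {
            int c = 0;
            for (String t : tokens) {
                if (t.equals(word.toLowerCase())) ++c;
            }
            if (c > 0) counts.put(word, c);
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] dict = {"trust", "network"};
        System.out.println(runDictionary(dict, "Social trust and system trust in the network"));
        // contains trust=2 and network=1
    }
}
```

Since words are not shared between dictionaries, the word id alone identifies the owning dictionary, which is what makes the corpus-to-dictionary relationship recoverable.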

Phil 12.22.15

VTX 7:00 – 6:00

  • Probabilistic Inference II
    • Assertion – Any variable in a graph is said by me to be independent of any other non-descendant, given its parents. All the causality flows through the parents.
    • A belief net or Bayes net is *always* acyclic and directed.
    • Traverse the graph from the bottom up, so that no node depends on a node to its left in a list.
    • Generating the list: BayesNetFromData
    • When using the list, work from the top down in the list
    • Naive Bayesian inference
      • P(a|b)P(b) = P(a,b) = P(b|a)P(a)
      • P(a|b) = (P(b|a)P(a))/P(b) BayesChain
      • Can use Bayes to decide between models – Naive Bayesian Classification
      • Use the sum of the logs of the probabilities rather than the products because otherwise we run out of bits of precision
    • Naive Bayes is the right thing to do when you don’t know anything about structure (you just have symptoms)
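    • The “sum of logs instead of products” point can be shown directly: multiplying many small probabilities underflows a double, while summing their logs stays well-scaled. A minimal sketch (the 1e-5 feature probability is just an illustrative value):

```java
public class LogProbDemo {
    // Product of n small probabilities underflows to 0.0; sum of logs does not.
    public static double productOfProbs(double p, int n) {
        double prod = 1.0;
        for (int i = 0; i < n; i++) prod *= p;
        return prod;
    }

    public static double sumOfLogProbs(double p, int n) {
        double sum = 0.0;
        for (int i = 0; i < n; i++) sum += Math.log(p);
        return sum;
    }

    public static void main(String[] args) {
        // 1000 features each with P = 1e-5: the product would be 1e-5000, far
        // below the smallest double, but the log-sum is a usable score.
        System.out.println(productOfProbs(1e-5, 1000));   // 0.0 (underflow)
        System.out.println(sumOfLogProbs(1e-5, 1000));    // about -11512.9
    }
}
```

Comparing models by their log-sums gives the same winner as comparing products would, without running out of bits of precision.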
  • Hibernate
    • Adding config.setProperty("hbm2ddl.auto", "update"); to the setup, so that tables can be rebuilt on demand. Nope, that didn’t work. Maybe I can’t split configuration between the config file and programmatic variables?
    • The only way that I was able to get this to work as an argument was to have a setupTables flag indicate which config to read. That works well though.
    • Got simple collections running, which means that I should be able to get networks built. Basically modified the example from Just Hibernate that starts on page 53.
    • Next, we work on getting inheritance to work. I think this will help.
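    • For reference, the config-file form of the setting that the setupTables flag selects; note the full property key is hibernate.hbm2ddl.auto, and the missing hibernate. prefix is one possible reason the programmatic setProperty() call above was ignored (my assumption, not verified here):

```xml
<!-- hibernate.cfg.xml fragment: alter/rebuild tables to match the mappings -->
<property name="hibernate.hbm2ddl.auto">update</property>
```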
  • Initial Java class network thoughts, just to try storing and retrieving items
    • BaseItem
      • guid
    •  BaseNode extends BaseItem
      • node_id
      • name
    • BaseEdge extends BaseItem
      • edge_id
      • source
      • target
      • weight
    • BaseNetwork extends BaseItem
      • network_id
      • name
      • owner
      • edgeList
      • nodeList (we need this because we may have orphans in the network)
    • BaseOwner extends BaseItem
      • owner_id
      • name
      • password?
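  • A plain-Java sketch of the network classes above, before any Hibernate annotations, mainly to check that keeping a separate nodeList really does let a network hold orphan nodes that no edge references. Field names follow the list; everything else is mine:

```java
import java.util.ArrayList;
import java.util.List;

public class NetworkDemo {
    static class BaseNode { String name; BaseNode(String n) { name = n; } }
    static class BaseEdge {
        BaseNode source, target; double weight;
        BaseEdge(BaseNode s, BaseNode t, double w) { source = s; target = t; weight = w; }
    }
    static class BaseNetwork {
        // Both lists are needed: edgeList alone would lose orphan nodes.
        List<BaseNode> nodeList = new ArrayList<>();
        List<BaseEdge> edgeList = new ArrayList<>();
    }

    // True if any node in nodeList is touched by no edge.
    public static boolean hasOrphans(BaseNetwork net) {
        for (BaseNode n : net.nodeList) {
            boolean referenced = false;
            for (BaseEdge e : net.edgeList) {
                if (e.source == n || e.target == n) { referenced = true; break; }
            }
            if (!referenced) return true;
        }
        return false;
    }

    public static BaseNetwork makeSample(boolean withOrphan) {
        BaseNetwork net = new BaseNetwork();
        BaseNode a = new BaseNode("a"), b = new BaseNode("b");
        net.nodeList.add(a);
        net.nodeList.add(b);
        net.edgeList.add(new BaseEdge(a, b, 1.0));
        if (withOrphan) net.nodeList.add(new BaseNode("orphan"));
        return net;
    }

    public static void main(String[] args) {
        System.out.println(hasOrphans(makeSample(true)));   // true
        System.out.println(hasOrphans(makeSample(false)));  // false
    }
}
```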

Phil 12.21.15

8:00 – 6:00 VTX

  • No MIT video today. Went out and saw Star Wars. Fun! Need to see it again when the crowds thin out in an IMAX theater.
  • Copied some stunt data into the hibernate_test db.
  • Ran the code that set up the session and connected to the (empty) db. No exceptions, so I think it’s working this time…
  • The IDE is tracking annotations. The names in the annotation class need to be the same as the table and element names, or there is an error. (See: IntelliJ Hibernate Setup.)
  • Ok, reading and writing into the db. Now to clean it up and separate elements.
  • Here’s the current cleaned up version. Still need to create the table more properly.
    package com.philfeldman.mains;
    
    import com.philfeldman.mappings.Employee;
    import org.hibernate.HibernateException;
    import org.hibernate.Query;
    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.cfg.Configuration;
    import org.hibernate.metadata.ClassMetadata;
    import org.hibernate.service.ServiceRegistry;
    import org.hibernate.service.ServiceRegistryBuilder;
    
    import java.util.Map;
    import java.util.Random;
    
    /**
     * Created by philip.feldman on 12/21/2015.
     *
     * A simple test program that will read and write from a table in a database. In MySql, the
     * table is in the form:
             CREATE TABLE employee (
                 id int(11) NOT NULL AUTO_INCREMENT,
                 name varchar(50) DEFAULT NULL,
                 PRIMARY KEY (id)
             );
     */
    public class EmployeeTest {
        private SessionFactory sessionFactory;
        private ServiceRegistry serviceRegistry;
        private Session session;
        private Random rand;
    
        public EmployeeTest()throws ExceptionInInitializerError{
            this.rand = new Random();
            try {
                Configuration config = new Configuration();
                config.configure("hibernate.cfg.xml");
    
                this.serviceRegistry = new ServiceRegistryBuilder().applySettings(config.getProperties()).buildServiceRegistry();
                this.sessionFactory = config.buildSessionFactory(serviceRegistry);
                this.session = this.sessionFactory.openSession();
            } catch (Throwable ex) {
                throw new ExceptionInInitializerError(ex);
            }
        }
    
        public void closeSession(){
            this.session.close();
        }
    
        public void printAllEntityNames(){
            System.out.println("querying all the managed entities...");
            final Map metadataMap = this.session.getSessionFactory().getAllClassMetadata();
            System.out.println("There are [" + metadataMap.keySet().size() + "] members in the set");
            for (Object key : metadataMap.keySet()) {
                System.out.println("key = ["+key.toString()+"]");
            }
        }
    
        public void printAllEmployees(){
            String key = Employee.class.getCanonicalName();
            final Map metadataMap = this.session.getSessionFactory().getAllClassMetadata();
            final ClassMetadata classMetadata = (ClassMetadata) metadataMap.get(key);
            final String entityName = classMetadata.getEntityName();
            final Query query = session.createQuery("from " + entityName);
            System.out.println("executing: " + query.getQueryString());
            for (Object o : query.list()) {
                Employee e = (Employee) o;
                System.out.println("  " + e.toString());
            }
        }
    
        public void addRandomEmployee(){
            try {
                session.beginTransaction();
                Employee employee = new Employee();
                employee.setName("rand(" + this.rand.nextInt(100) + ")");
                session.save(employee);
                session.getTransaction().commit();
            }catch (HibernateException e){
                session.getTransaction().rollback();
            }
        }
    
        public static void main(String[] args){
            try {
                //System.out.println("Employee.class.getCanonicalName: "+Employee.class.getCanonicalName());
                /***/
                EmployeeTest et = new EmployeeTest();
                et.printAllEntityNames();
                et.printAllEmployees();
                et.addRandomEmployee();
                et.closeSession();
                 /***/
            }catch (Exception e){
                e.printStackTrace();
            }
    
        }
    }

Phil 12.16.15

7:00 – 9:00, 10:30 – 4:30 VTX

  • Since I’ll be missing the scrum today, sent Aaron an email with status. The gist: until I know whether we’re going to have a semantic network for our derived data, I don’t know how to do a taxonomy.
  • Got RabbitMQ running, following the Local RabbitMQ Setup in Confluence. To open a command prompt as full admin, you have to run it from the ‘start’ input field with Ctrl-shift-enter
  • Running the NLPService produces errors. It doesn’t seem to be a permissions issue. Sent Balaji an email, but here are the errors for future reference:
    2015-12-16 08:25:24.449 ERROR 3588 --- [pool-8-thread-1] com.netflix.discovery.DiscoveryClient    : DiscoveryClient_NLPSERVICE/PFELDMAN-NCS - was unable to send heartbeat!
    
    com.sun.jersey.api.client.ClientHandlerException: org.apache.http.conn.HttpHostConnectException: Connection to http://localhost:8761 refused
            at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
            at com.sun.jersey.api.client.filter.GZIPContentEncodingFilter.handle(GZIPContentEncodingFilter.java:120)
            at com.netflix.discovery.EurekaIdentityHeaderFilter.handle(EurekaIdentityHeaderFilter.java:28)
            at com.sun.jersey.api.client.Client.handle(Client.java:648)
            at com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
            at com.sun.jersey.api.client.WebResource.put(WebResource.java:211)
            at com.netflix.discovery.DiscoveryClient.makeRemoteCall(DiscoveryClient.java:1097)
            at com.netflix.discovery.DiscoveryClient.makeRemoteCall(DiscoveryClient.java:1060)
            at com.netflix.discovery.DiscoveryClient.access$500(DiscoveryClient.java:105)
            at com.netflix.discovery.DiscoveryClient$HeartbeatThread.run(DiscoveryClient.java:1583)
            at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
            at java.util.concurrent.FutureTask.run(Unknown Source)
            at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
            at java.lang.Thread.run(Unknown Source)
    Caused by: org.apache.http.conn.HttpHostConnectException: Connection to http://localhost:8761 refused
            at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:190)
            at org.apache.http.impl.conn.AbstractPoolEntry.open(AbstractPoolEntry.java:151)
            at org.apache.http.impl.conn.AbstractPooledConnAdapter.open(AbstractPooledConnAdapter.java:125)
            at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:640)
            at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:479)
            at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
            at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:827)
            at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
            ... 14 common frames omitted
    Caused by: java.net.ConnectException: Connection refused: connect
            at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
            at java.net.DualStackPlainSocketImpl.socketConnect(Unknown Source)
            at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source)
            at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source)
            at java.net.AbstractPlainSocketImpl.connect(Unknown Source)
            at java.net.PlainSocketImpl.connect(Unknown Source)
            at java.net.SocksSocketImpl.connect(Unknown Source)
            at java.net.Socket.connect(Unknown Source)
            at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127)
            at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
            ... 21 common frames omitted
    
    
    2015-12-16 08:25:26,620 ERROR [pool-9-thread-1] com.netflix.discovery.DiscoveryClient [nlp-service-local] Can't get a response from http://localhost:8761/eureka/apps/
    Can't contact any eureka nodes - possibly a security group issue?
    com.sun.jersey.api.client.ClientHandlerException: org.apache.http.conn.HttpHostConnectException: Connection to http://localhost:8761 refused
            at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184) ~[jersey-apache-client4-1.11.jar!/:1.11]
            at com.sun.jersey.api.client.filter.GZIPContentEncodingFilter.handle(GZIPContentEncodingFilter.java:120) ~[jersey-client-1.11.jar!/:1.11]
            at com.netflix.discovery.EurekaIdentityHeaderFilter.handle(EurekaIdentityHeaderFilter.java:28) ~[eureka-client-1.1.147.jar!/:1.1.147]
            at com.sun.jersey.api.client.Client.handle(Client.java:648) ~[jersey-client-1.11.jar!/:1.11]
            at com.sun.jersey.api.client.WebResource.handle(WebResource.java:670) ~[jersey-client-1.11.jar!/:1.11]
            at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74) ~[jersey-client-1.11.jar!/:1.11]
            at com.sun.jersey.api.client.WebResource$Builder.get(WebResource.java:503) ~[jersey-client-1.11.jar!/:1.11]
            at com.netflix.discovery.DiscoveryClient.getUrl(DiscoveryClient.java:1567) [eureka-client-1.1.147.jar!/:1.1.147]
            at com.netflix.discovery.DiscoveryClient.makeRemoteCall(DiscoveryClient.java:1113) [eureka-client-1.1.147.jar!/:1.1.147]
            at com.netflix.discovery.DiscoveryClient.makeRemoteCall(DiscoveryClient.java:1060) [eureka-client-1.1.147.jar!/:1.1.147]
            at com.netflix.discovery.DiscoveryClient.getAndStoreFullRegistry(DiscoveryClient.java:835) [eureka-client-1.1.147.jar!/:1.1.147]
            at com.netflix.discovery.DiscoveryClient.fetchRegistry(DiscoveryClient.java:746) [eureka-client-1.1.147.jar!/:1.1.147]
            at com.netflix.discovery.DiscoveryClient.access$1400(DiscoveryClient.java:105) [eureka-client-1.1.147.jar!/:1.1.147]
            at com.netflix.discovery.DiscoveryClient$CacheRefreshThread.run(DiscoveryClient.java:1723) [eureka-client-1.1.147.jar!/:1.1.147]
            at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [na:1.8.0_66]
            at java.util.concurrent.FutureTask.run(Unknown Source) [na:1.8.0_66]
            at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [na:1.8.0_66]
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [na:1.8.0_66]
            at java.lang.Thread.run(Unknown Source) [na:1.8.0_66]
    Caused by: org.apache.http.conn.HttpHostConnectException: Connection to http://localhost:8761 refused
            at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:190) ~[httpclient-4.2.1.jar!/:4.2.1]
            at org.apache.http.impl.conn.AbstractPoolEntry.open(AbstractPoolEntry.java:151) ~[httpclient-4.2.1.jar!/:4.2.1]
            at org.apache.http.impl.conn.AbstractPooledConnAdapter.open(AbstractPooledConnAdapter.java:125) ~[httpclient-4.2.1.jar!/:4.2.1]
            at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:640) ~[httpclient-4.2.1.jar!/:4.2.1]
            at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:479) ~[httpclient-4.2.1.jar!/:4.2.1]
            at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906) ~[httpclient-4.2.1.jar!/:4.2.1]
            at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:827) ~[httpclient-4.2.1.jar!/:4.2.1]
            at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170) ~[jersey-apache-client4-1.11.jar!/:1.11]
            ... 18 common frames omitted
    Caused by: java.net.ConnectException: Connection refused: connect
            at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method) ~[na:1.8.0_66]
            at java.net.DualStackPlainSocketImpl.socketConnect(Unknown Source) ~[na:1.8.0_66]
            at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source) ~[na:1.8.0_66]
            at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source) ~[na:1.8.0_66]
            at java.net.AbstractPlainSocketImpl.connect(Unknown Source) ~[na:1.8.0_66]
            at java.net.PlainSocketImpl.connect(Unknown Source) ~[na:1.8.0_66]
            at java.net.SocksSocketImpl.connect(Unknown Source) ~[na:1.8.0_66]
            at java.net.Socket.connect(Unknown Source) ~[na:1.8.0_66]
            at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127) ~[httpclient-4.2.1.jar!/:4.2.1]
            at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180) ~[httpclient-4.2.1.jar!/:4.2.1]
            ... 25 common frames omitted
    
    2015-12-16 08:25:26.652 ERROR 3588 --- [pool-9-thread-1] com.netflix.discovery.DiscoveryClient    : DiscoveryClient_NLPSERVICE/PFELDMAN-NCS - was unable to refresh its cache! status = org.apache.http.conn.HttpHostConnectException: Connection to http://localhost:8761 refused
    
    com.sun.jersey.api.client.ClientHandlerException: org.apache.http.conn.HttpHostConnectException: Connection to http://localhost:8761 refused
            at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:184)
            at com.sun.jersey.api.client.filter.GZIPContentEncodingFilter.handle(GZIPContentEncodingFilter.java:120)
            at com.netflix.discovery.EurekaIdentityHeaderFilter.handle(EurekaIdentityHeaderFilter.java:28)
            at com.sun.jersey.api.client.Client.handle(Client.java:648)
            at com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
            at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
            at com.sun.jersey.api.client.WebResource$Builder.get(WebResource.java:503)
            at com.netflix.discovery.DiscoveryClient.getUrl(DiscoveryClient.java:1567)
            at com.netflix.discovery.DiscoveryClient.makeRemoteCall(DiscoveryClient.java:1113)
            at com.netflix.discovery.DiscoveryClient.makeRemoteCall(DiscoveryClient.java:1060)
            at com.netflix.discovery.DiscoveryClient.getAndStoreFullRegistry(DiscoveryClient.java:835)
            at com.netflix.discovery.DiscoveryClient.fetchRegistry(DiscoveryClient.java:746)
            at com.netflix.discovery.DiscoveryClient.access$1400(DiscoveryClient.java:105)
            at com.netflix.discovery.DiscoveryClient$CacheRefreshThread.run(DiscoveryClient.java:1723)
            at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
            at java.util.concurrent.FutureTask.run(Unknown Source)
            at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
            at java.lang.Thread.run(Unknown Source)
    Caused by: org.apache.http.conn.HttpHostConnectException: Connection to http://localhost:8761 refused
            at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:190)
            at org.apache.http.impl.conn.AbstractPoolEntry.open(AbstractPoolEntry.java:151)
            at org.apache.http.impl.conn.AbstractPooledConnAdapter.open(AbstractPooledConnAdapter.java:125)
            at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:640)
            at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:479)
            at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
            at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:827)
            at com.sun.jersey.client.apache4.ApacheHttpClient4Handler.handle(ApacheHttpClient4Handler.java:170)
            ... 18 common frames omitted
    Caused by: java.net.ConnectException: Connection refused: connect
            at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
            at java.net.DualStackPlainSocketImpl.socketConnect(Unknown Source)
            at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source)
            at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source)
            at java.net.AbstractPlainSocketImpl.connect(Unknown Source)
            at java.net.PlainSocketImpl.connect(Unknown Source)
            at java.net.SocksSocketImpl.connect(Unknown Source)
            at java.net.Socket.connect(Unknown Source)
            at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:127)
            at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:180)
            ... 25 common frames omitted
  • It turns out that these errors come from the service registering with the discovery process. In a local environment you don’t have a service registry running; you can run one if you want, but that’s a separate process (project). Regardless of those registration errors, the REST calls still work.
    • The way to remove the service registration noise in the local env is, in src/main/resources/nlpservice-config.xml, to change this:
    • <serviceRegistry>http://localhost:8761/eureka/</serviceRegistry>
    • to
    • <serviceRegistry>none</serviceRegistry>
    • Do the gradle build again and run it.
    • And a useful thread:
      Patakula, Balaji 11:44a 
      Hello Phil
      
      Me 11:44a 
      Hiya
      
      Patakula, Balaji 11:45a 
      the url should be localhost:8870/nlpservice/ner
      
      Patakula, Balaji 11:45a 
      with no double slash after the port
      
      Me 11:46a 
      localhost:8870/nlpservice/ner gives the same error in my setup
      
      Patakula, Balaji 11:46a 
      also the body should be like { "text": "my name is Phil Feldman."}
      
      Patakula, Balaji 11:46a 
      or any text that u want
      
      Me 11:47a 
      So it's not the JSON object on the NLPService page?
      
      
      Patakula, Balaji 11:48a 
      if u import the nlp.json into the postman
      
      Patakula, Balaji 11:48a 
      all the requests will be already there
      
      Me 11:49a 
      Import how?
      Patakula, Balaji 11:49a 
      there is an import menu on postman
      
      Me 11:50a 
      Looking for it...
      
      Patakula, Balaji 11:50a 
      the middle panel on the top black menu last item
      
      Me 11:50a 
      Got it.
      
      Patakula, Balaji 11:51a 
      just import that json downloaded from the wiki
      
      Patakula, Balaji 11:51a 
      and u should have the collection now in ostman 
      
      Patakula, Balaji 11:51a 
      postman 
      
      
      
      Patakula, Balaji 11:51a 
      and u can just click to send
      
      Me 11:52a 
      Added the file. Now what.
      
      Patakula, Balaji 11:52a 
      can u share the screen
      
      Me 11:52a 
      using what?
      
      Patakula, Balaji 11:52a 
      u have the nlp service running?
      
      Patakula, Balaji 11:53a 
      just in IM 
      
      Patakula, Balaji 11:53a 
      there is a present screen on the bottom of this chat
      
      Me 11:53a 
      nlp service is running. It extracted my entity as well. Now I'm curious about that bigger json file
      
      Patakula, Balaji 11:54a 
      that json file is just the REST calls that are supported by the service
      
      Patakula, Balaji 11:54a 
      it is just a way of documenting the REST 
      
      Patakula, Balaji 11:54a 
      so some one can just import the file and execute the commands
      
      Me 11:54a 
      So how does it get ingested?
      
      Patakula, Balaji 11:55a 
      which one?
      
      Me 11:55a 
      nlp.json
      
      Patakula, Balaji 11:55a 
      its not ingested. NLP is service. It gets the requests through Rabbit queue from the Crawler ( another service)
      
      Patakula, Balaji 11:56a 
      if u need to test the functionality of NLP, the way u can test and see the results is using the REST interface that we are doing now
      
      Me 11:56a 
      so nlp.json is a configuration file for postman?
      
      Patakula, Balaji 11:56a 
      thats right
      
      Me 11:57a 
      Ah. Not obvious.
      
      Patakula, Balaji 11:57a 
      Is Aaron sit next to you?
      
      Me 11:58a 
      No, he stepped out for a moment. He should be back in 30 minutes or so.
      
      Patakula, Balaji 11:58a 
      may be u can get the data flow from him and he knows how to work with all these tools
      
      Me 11:58a 
      Yeah, he introduced me to Postman.
      
      Me 11:58a 
      But he thought nlp.json was something to send to the NLPService.
      
      Patakula, Balaji 11:58a 
      may be he can give a brain dump of the stuff and how services interact, how data flows etc.,
      
      Me 11:59a 
      I'm starting to see how it works. Was not expecting to see Erlang.
      
      Me 12:00p 
      Can RabbitMQ coordinate services under development on my machine with services stood up on a test environment, such as AWS?
      
      Patakula, Balaji 12:01p 
      u can document all the REST calls that a service exposes by hand writing all those ...or just export the REST calls from postman and every one who wants to use the service can just import that json and work with the REST interface
      
      Me 12:01p 
      Got it.
      
      Patakula, Balaji 12:01p 
      RabbitMq is written in Erlang and we interface with it for messaging
      
      Patakula, Balaji 12:02p 
      yes, u can configure the routes to work that way
      
      Patakula, Balaji 12:02p 
      meaning mismatch services between different environments
      
      Me 12:02p 
      Yeah, I see that. Not that surprising that a communications manager would be written in Erlang. But still a rare thing to see.
      
      Me 12:03p 
      Is there a collection of services stood up that way for development?
      
      Patakula, Balaji 12:04p 
      u installed rabbit yesterday locally on ur machine
      
      Me 12:04p 
      Yes, otherwise none of this would be working?
      
      Patakula, Balaji 12:04p 
      so u can run various services now orchestrated through ur local rabbit
      
      
      
      Me 12:05p 
      Understood. Are there currently stood-up services that can be accessed on an ad-hoc basis, or would I need to do that?
      
      Patakula, Balaji 12:05p 
      Rabbit is only for streaming messages. Every service exposes both streaming ( Rabbitmq messages) and REST interfaces
      
      Patakula, Balaji 12:06p 
      there are no services stood up in adhoc env currently. There is a CI ,QA and Demo env
      
      Patakula, Balaji 12:06p 
      all those envs have all the services running
      
      Me 12:07p 
      What's Cl?
      
      Me 12:07p 
      I'd guess continuous integration, but it's ambiguous.
      
      Patakula, Balaji 12:07p 
      continuous integration. Every code checkin automatically builds the system, runs the tests, creates docker images and deploys those services and starts them 
      
      Me 12:08p 
      Can these CI services be pinged directly?
      
      Patakula, Balaji 12:08p 
      ye
      
      Patakula, Balaji 12:08p 
      yes
      
      Me 12:09p 
      Do you need to be on the VPN?
      
      Patakula, Balaji 12:09p 
      http://dockerapps.philfeldman.com:8763/
      
      Patakula, Balaji 12:09p 
      those are the services running
      
      Patakula, Balaji 12:09p 
      and dockerapps is the host machine for CI
      
      Me 12:09p 
      And how do I access the NLPService on dockerapps?
      
      Patakula, Balaji 12:10p 
      access meaning? u want t send the REST requests to CI service?
      
      Me 12:10p 
      Yeah. Bad form?
      
      Patakula, Balaji 12:11p 
      just in the REST, change the localhost to dockerapps.philfeldman.com
      
      Me 12:12p 
      I get a 'Could not get any response'
      
      Me 12:12p 
      dockerapps.philfeldman.com:8870/nlpservice/ner
      
      Patakula, Balaji 12:12p 
      sorry, NLP is running on a different host 10.18.7.177
      
  • Learning about RabbitMQ
  • Installing the Google Chrome Postman plugin
    • Set the POST option
    • Set RAW
    • Header to Content-Type
    • Value to application/json
    • URL is localhost:8870/nlpservice/ner (single slash after the port)
    • Place the JSON in the ‘Body’ tab
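The Postman setup above maps to a plain Java POST. A sketch, assuming the endpoint and JSON body from the chat; the class and helper names are mine, and the service has to be running locally for send() to succeed:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class NerRequest {

    // Collapse accidental repeated slashes in the path (the bug from the
    // chat: "localhost:8870//nlpservice/ner") while leaving "http://" alone.
    static String normalizePath(String url) {
        return url.replaceAll("(?<!:)/{2,}", "/");
    }

    // Shown but not called from main: requires the NLP service to be up.
    static int send(String url, String body) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode();
    }

    public static void main(String[] args) {
        // Body from the chat: { "text": "my name is Phil Feldman." }
        System.out.println(normalizePath("http://localhost:8870//nlpservice/ner"));
        // prints http://localhost:8870/nlpservice/ner
    }
}
```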

Phil 11.25.15

7:00 – 1:00 Leave

  • Constraints: Search, Domain Reduction
    • Order from most constrained to least.
    • For a constrained problem, check over- and under-allocations to see where the gap between fast failure and fast completion lies.
    • Only recurse through neighbors whose domain (set of choices) has been reduced to 1.
  • Dictionary
    • Add an optional ‘source_text’ field to the tn_dictionaries table so that user-added words can be compared to the text. Done. One issue: the dictionary could be used against a different corpus, at which point this field would be little more than a creation artifact.
    • Add a ‘source_count’ field to the tn_dictionary_entries table that is shown in the directive. Defaults to zero? Done. Same issue as above: when compared against a new corpus, do we recompute the counts?
    • Wire up Attach Dictionary to Network
      • Working on AlchemyDictReflect that will place keywords in the tn_items table and connect them in the tn_associations table.
      • Had to add a few helper methods in networkDbIo.php to handle the modifying of the network tables, since alchemyNLPbase doesn’t extend baseBdIo. Not the cleanest thing I’ve ever done, but not *horrible*.
      • Done and working! Need to deploy.
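The “most constrained to least” ordering from the constraints notes above can be sketched as a toy Java example (my own naming, not from the lecture): pick the variable with the smallest remaining domain first, since it fails or commits fastest.

```java
import java.util.*;

public class MostConstrainedFirst {
    // Order variables so the one with the smallest remaining domain
    // (fewest choices) is tried first.
    static List<String> order(Map<String, Set<String>> domains) {
        List<String> vars = new ArrayList<>(domains.keySet());
        vars.sort(Comparator.comparingInt(v -> domains.get(v).size()));
        return vars;
    }

    public static void main(String[] args) {
        Map<String, Set<String>> domains = new LinkedHashMap<>();
        domains.put("A", new HashSet<>(Arrays.asList("red", "green", "blue")));
        domains.put("B", new HashSet<>(Arrays.asList("red")));
        domains.put("C", new HashSet<>(Arrays.asList("red", "green")));
        System.out.println(order(domains)); // [B, C, A]
    }
}
```

A domain reduced to exactly one value is the trigger for the recursion rule above: only that variable’s assignment is forced, so only it is worth propagating through.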

Phil 8.20.15

8:30 – 4:30 SR

  • Working on getting admin on the servers. Then my dev box.
  • Only clear the data item list when a new query is submitted; an item added to the network should show up there immediately. Any selected items appear at the head of the itemList, above the previous search results, so more than one item can be added from the same search.
  • Items are added to the network only from the response from the server. The query should also send up a list of item ids and association ids that it already has, so only new items are returned. Ok, that’s mostly done, though at this point the full network is returned.
  • Network file operations need
    • A default state (id == -1?). This is an interesting problem. Since the system needs the server and the DB to build the connections in the network, we have to have a provisional network. I think I’ll handle this by creating the ‘new’ network as ‘provisional’. If the user exits without saving and then starts a new session, any prior provisional network is deleted.
    • Merge With – adds the current network to the selected network and then downloads the whole thing
    • Load – deletes anything current
    • Save –  save as current
  • I think I have this working now: the network is saved as not-archive the moment the app comes up. That id is then used until another network is loaded (at which point all not-archive networks for the user are deleted anyway). If the user explicitly saves the new network, it is set to archive.
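A minimal sketch of the provisional-network lifecycle described above, with my own class and method names (the real implementation lives in the PHP/DB layer):

```java
import java.util.*;

public class NetworkStore {
    static class Network {
        final int id;
        boolean archive;
        Network(int id) { this.id = id; this.archive = false; }
    }

    private final List<Network> networks = new ArrayList<>();
    private int nextId = 1;

    // App startup: drop any stale provisional networks, then immediately
    // persist a fresh not-archive one whose id is used for the session.
    Network startSession() {
        networks.removeIf(n -> !n.archive);
        Network n = new Network(nextId++);
        networks.add(n);
        return n;
    }

    // An explicit user save promotes the provisional network to archive.
    void save(Network n) { n.archive = true; }

    long provisionalCount() {
        return networks.stream().filter(n -> !n.archive).count();
    }

    public static void main(String[] args) {
        NetworkStore store = new NetworkStore();
        store.startSession();                  // provisional, never saved
        store.startSession();                  // deletes the unsaved one
        System.out.println(store.provisionalCount()); // 1
    }
}
```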

Phil 7.23.15

9:00 – 5:00 SR

  • First, I’m going to check to see if I can pull in and display an entire webpage with ng-sanitize
  • Had to get a cert for my dev machine php install to get curl to be able to pull https content. Here’s the relevant info from the php.net post
Please everyone, stop setting CURLOPT_SSL_VERIFYPEER to false or 0. If your PHP installation doesn't have an up-to-date CA root certificate bundle, download the one at the curl website and save it on your server:

http://curl.haxx.se/docs/caextract.html

Then set a path to it in your php.ini file, e.g. on Windows:

curl.cainfo=c:\php\cacert.pem

Turning off CURLOPT_SSL_VERIFYPEER allows man in the middle (MITM) attacks, which you don't want!
  • Adding the ability to open full pages. It’s now working (not for all pages, will need to finesse that), but I had a few moments where Chrome would NOT LET GO of its cache. Sheesh.
  • Loading the HTML in the PHP and sending it back as content didn’t work. The trick is to open the page in a frame directly (and save the link), based on the Stack Overflow starting point.
      • Use the $sce service component from ngSanitize and inject in the main module:
        this.appModule.directive('ngFeedPanel', ['$timeout','$rootScope', '$sce', queryDirectivePtr]);
      • It gets incorporated in the directive so:
        public ctor(timeout:ng.ITimeoutService, rootscope:ng.IScope, sce:ng.ISCEService):ng.IDirective {
            this.sceProvider = sce;
            // other stuff goes here
        }
      • That in turn gets called in the html like this:
        
        
  • Lastly, getLink() in the directive looks like:
    scope.getLink = ():string => {
        var mobj:RssControllersModule.IDataResponse = scope.messageObj;
        return this.sceProvider.trustAsResourceUrl(mobj.link);
    };

Dong Shin 07.10.2015

  • working on documentation
  • added SpringLibs java project to contain all the Spring and other libraries used for FinancialAssistantService, ReqoncilerService, StoredQueryService so that the projects don’t depend on any external libraries
  • SpringLibs – /trunk/Sandbox_folders/DONG_SANDBOX/Java/SpringLibs

Dong Shin 06.24.2015

  • working on lab locations data
    • should work with FA data
    • moved the project_lab_locations table to project_portfolio_enh database
    • updated test projects to match new locations
    • simple SQL to find lat/long of labs specified in budget_center
      • SELECT * FROM budget_centers bc LEFT JOIN lab_locations ll ON bc.lab = ll.name
  • started charts typescript project