News Focus
Followers 387
Posts 16955
Boards Moderated 0
Alias Born 10/23/2007

Re: None

Sunday, 05/13/2018 6:17:30 AM

Post# of 143961
David Callaghan just came from the company: Perficient (PRFT, Nasdaq, $24.65/share).
David Callaghan on LinkedIn; info below:


500+ connections
I bring twenty years of development experience and I'm currently hands-on with Hadoop/Spark, blockchain and cloud, coding in Java, Scala and Go. I'm certified in and work extensively with Hadoop, Cassandra, Spark, AWS, MongoDB and Pentaho. Most recently, I've been bringing integrated blockch...

David’s Articles & Activity
8,725 followers

I took a deep dive into the consensus algorithms for fail-stop and byzantine fault tolerance.
David shared this

Got my ScrumAlliance Certified ScrumMaster® certification

I dive into the Advanced Security model of the new version of DataStax

Perficient
Big Data Strategist and Solutions Architect
Company Name: Perficient
Dates Employed: Jun 2017 – Present · Duration: 1 yr
Location: Charlotte, North Carolina Area
DataStax Evangelist

Publix: Built out a new data platform on Azure, using Cosmos DB to power web and mobile traffic, Spark to enable machine learning on these datasets (powering a recommendation engine and other functionality), and HDInsight to provide Customer360 analytics.

Volkswagen: Built out an AWS IoT platform to manage electric car refueling stations across the US.

Ford Motor Company: Built a full DataStax back end (Cassandra, Spark and Solr) on Azure to support a Facebook AI chatbot for customer engagement.

2C Data
Big Data & Blockchain Developer
Company Name: 2C Data

Dates Employed: Apr 2013 – Present · Duration: 5 yrs 2 mos
Location: Charlotte, NC
Responsible for developing and implementing Big Data solutions. Current projects include:
Orphans In The Desert
Cluster sufferers of distinct rare diseases into groups large enough, and with a sufficiently well-defined set of symptoms and markers, to allow pharmaceutical companies to develop drugs and health-care providers to develop treatment protocols, using IBM Watson, Cloudant and Bluemix.

Professional Investors Network
A financial services marketing company specializing in building real estate investment entrepreneurs through training, networking and access to financing.
• Educational content, including articles, training material and video, is stored in MongoDB and is searchable using Lucene/Solr. Document analysis is done in MapReduce with Hive.
• Business-layer DSLs developed in Scala, back-end development in Java, financial analysis algorithms coded in R.

QUIOBO.CO
Big (Fast) Data PaaS for Communication Service Providers (CSPs) operating in Colombia, SA, using AWS.
• Store call detail (xDRs and CDRs), network and customer data via Flume into Cassandra, and provide business intelligence using MapReduce plus custom and Mahout-based algorithms for fraud detection (subscription and superimposition fraud), customer profiling and network fault isolation
• Business-intelligence reporting using Tableau, with querying available through Hive
Media (4 documents): Apache Atlas Preview; Orphans in the Desert; Big Data in Telecommunications: A Practical Roadmap for the Colombian CSP; Segment Of One
Bank of America
Hadoop / Spark Developer
Company Name: Bank of America
Dates Employed: Feb 2016 – May 2017 · Duration: 1 yr 4 mos
Location: Charlotte, North Carolina Area
Responsible for developing and implementing an Instrumentation-as-a-Service platform processing 10K records per second from 1K applications using 30+ analytic models. Business partners are able to send logging and other asset-monitoring data to the system in JSON as a file, through a web service or directly to Kafka. This data is processed through our lambda architecture, with HBase as the speed layer and both Hive and Impala (using Parquet) as the batch layer. Kamanja serves as the real-time decision-support engine, implementing the models on data at rest and in real time.
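For readers unfamiliar with the lambda pattern mentioned above, here is a minimal Python sketch of the read path: a complete but stale batch view is overlaid with fresher speed-layer deltas at query time. The dictionaries merely stand in for the Hive/Impala and HBase views; none of this is the bank's actual code.

```python
# Minimal lambda-architecture read path: overlay speed-layer deltas
# (fresh, incremental) on top of batch-layer counts (complete, stale).

def merge_views(batch_view: dict, speed_view: dict) -> dict:
    """Answer a query by combining batch counts with speed-layer deltas."""
    merged = dict(batch_view)
    for key, delta in speed_view.items():
        merged[key] = merged.get(key, 0) + delta
    return merged

# batch stands in for Hive/Impala output, speed for the HBase layer.
batch = {"app-001": 120000, "app-002": 98000}
speed = {"app-002": 450, "app-003": 12}
print(merge_views(batch, speed))
# {'app-001': 120000, 'app-002': 98450, 'app-003': 12}
```

In a real deployment the batch view is rebuilt periodically and the speed view is reset at each rebuild; the merge itself stays this simple.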

Introduced and implemented a POC for using blockchain for data governance, with Hyperledger for the chaincode and HBase for the off-chain storage, extending the use of HBase's cell-level tagging to implement an eventual-trust pattern.
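The eventual-trust idea can be sketched in a few lines. This is a hypothetical illustration only: it assumes the off-chain store tags each cell with a trust status (much as HBase cell tags could be used) and promotes the cell once the corresponding chaincode transaction commits. The class and method names are invented, not part of HBase or Hyperledger.

```python
# Hypothetical eventual-trust pattern: off-chain records start as
# PENDING and become TRUSTED only after on-chain confirmation.

class OffChainStore:
    def __init__(self):
        self._cells = {}  # row key -> (value, trust tag)

    def put(self, key, value):
        # New data is written immediately but tagged as unverified.
        self._cells[key] = (value, "PENDING")

    def confirm(self, key):
        # Called when the chaincode transaction for this record commits.
        value, _ = self._cells[key]
        self._cells[key] = (value, "TRUSTED")

    def read_trusted(self):
        # Consumers see only records the chain has vouched for.
        return {k: v for k, (v, tag) in self._cells.items() if tag == "TRUSTED"}

store = OffChainStore()
store.put("r1", "lineage-event-1")
store.put("r2", "lineage-event-2")
store.confirm("r1")
print(store.read_trusted())  # {'r1': 'lineage-event-1'}
```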

Completed POCs to replace our current system with a home-grown Spark implementation built with Scalding using the Akka/Play framework. Additional POCs involved Cassandra, for faster ingestion and easier reporting access to the real-time layer, as well as MemSQL, to provide a single data repository for the speed and batch layers. I've also implemented potential alternatives to the enterprise platforms using Docker, Kubernetes and Mesos, and worked on the process of evaluating and onboarding FinTech companies.

Using Spring Boot and MongoDB, I built an early detection and response system that monitored Zookeeper, Kafka, HDFS, HBase, Hive, Spark and Kamanja. These microservices formed a unified platform called Level Zero to detect and resolve issues before they were raised by Level One support, substantially lowering the cost of ownership for these systems.
Sparks Ignite, Inc.
Bringing Big Data and Blockchain to the Cloud
Company Name: Sparks Ignite, Inc.
Dates Employed: May 2015 – May 2016 · Duration: 1 yr 1 mo
Location: Charlotte, North Carolina Area
Responsible for helping companies successfully implement and realize measurable value from their Big Data, Fast Data and NoSQL implementations.
Media (1 document): IoT Under the Hood
Wells Fargo
Big Data Developer
Company Name: Wells Fargo
Dates Employed: Jul 2014 – May 2015 · Duration: 11 mos
Location: Charlotte, North Carolina Area
Responsible for developing and implementing Big Data solutions.
• Built a regulatory-compliance engine using Kafka, Storm, Drools and Hive to analyze MS Exchange emails. Developed analytic algorithms in R and Mahout, as well as text-analytic tools using Lucene and Pig.
• Built a Flume-to-HBase workflow for Lending Grid to increase their storage capacity, and provided Phoenix to integrate with existing Apache Camel applications.
• Developed solutions for new regulatory reporting requirements for Treasury Analytics using HBase, Hive and SAS.
• Introduced standards and practices at the development (TDD) and operations (continuous deployment) levels.
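As a toy illustration of the rule-driven email surveillance described in the first bullet: the real engine evaluated Drools rules over Kafka/Storm streams, but the shape of the check can be shown in plain Python. The rule names and trigger phrases below are entirely invented.

```python
# Invented compliance rules: each rule is a name plus trigger phrases.
RULES = [
    ("guarantee-language", ["guaranteed return", "no risk"]),
    ("insider-language", ["before the announcement", "not public yet"]),
]

def flag_email(body: str) -> list:
    """Return the names of every rule the message body triggers."""
    lowered = body.lower()
    return [name for name, phrases in RULES
            if any(p in lowered for p in phrases)]

print(flag_email("This is a guaranteed return, trust me."))
# ['guarantee-language']
```

A rules engine like Drools adds what this sketch lacks: externally editable rules, conflict resolution and stateful matching across events.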
Media (1 document): Big Fast Data
Bank of America
Big Data Developer
Company Name: Bank of America
Dates Employed: Feb 2014 – Jul 2014 · Duration: 6 mos
Location: Charlotte, North Carolina Area
Responsible for developing applications in Java using Hadoop, Hive and HBase to support the Intrusion Detection Team.
• Built secure Java platform on JAAS for Kerberos authentication and authorization at the job level for MapReduce applications
• Implemented bi-directional integration of Hive, HBase and Giraph
• Developed a native Java wrapper to implement existing R algorithms in MapReduce
• Introduced standards and practices at the development (TDD) and operations (continuous deployment) level
Media (1 document): Big Fast Data - Enabling Perishable Insight at Scale
Duke Energy
Architect / Senior Java Developer
Company Name: Duke Energy
Dates Employed: Mar 2007 – Mar 2013 · Duration: 6 yrs 1 mo
Location: 400 South Tryon St, Charlotte, NC
Senior Java Developer/Architect responsible for evaluating/implementing enterprise-wide technology initiatives including Big Data (Hadoop, HBase and Cassandra), Enterprise Application Integration (Spring Integration), rules engine (Drools) and a standard Java framework using Spring, Hibernate with Maven and Jenkins.

Smart Grid
• Architected initial phase of Smart Grid implementation targeting 1.5M smart meters using Oracle Meter Data Management 2.0
• Developed Java app using Spring Integration to integrate Oracle MDM 2.0 w/ proprietary customer billing solution.
• Prototyped Hadoop MapReduce algorithms and Hive for precalculating data before MDM import.

Smart Energy
• Developed a Java framework for multiple ongoing prototypes of new and experimental clean energy technologies including electric vehicles and smart meters.
• Prototyped Cassandra solution for producing social graph from smart meter data (smartenergycharlotte.com/how-are-we-doing)

Pole Inspection
• Developed a Java app to dynamically create JUnit code and XML files from a Drools rules engine in order to test 70K+ inspection use cases.

Work Management
• Developed a web-based call-center management app using Drools for complex rule management.

Gas Odor Correlation
• Developed a web-based Java app for real-time analysis of data extracted from call-center operations, pipeline structural data, population statistics and maps, using user-configurable spherical-trigonometry equations to predict potential gas leaks in densely populated areas and prioritize crew dispatch.
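The spherical-trigonometry equations referenced above are of the kind exemplified by the haversine great-circle distance, sketched below. The coordinates in the example are made up, and this is not Duke Energy's code; it only shows the standard formula such an app would configure.

```python
import math

# Haversine great-circle distance between two points given in
# decimal degrees; result is in kilometres.
def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# e.g. distance from a call-center report to a pipeline segment
# (both points invented):
print(haversine_km(35.2271, -80.8431, 35.2083, -80.8303))
```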

Mobile Work Management System
• Developed an SOA system that takes messages from various new and legacy apps across companies from MQ using JMS, then interprets and routes each work request to the appropriate system.

Enterprise Solution Architecture Framework
• Prototyped smart-phone and tablet web interface using JSF 2.0 and modifications to enterprise CSS.
• Prototyped Android application for scanning smart meter tags.
Media (2): Smart Energy Now | How Are We Doing? (link); Smart Grids and Big Data (document)
New York Board of Trade
Senior Java Developer
Company Name: New York Board of Trade
Dates Employed: Jun 2005 – Jan 2007 · Duration: 1 yr 8 mos
Location: Greater New York City Area
Developed Market Surveillance applications in Java, as well as the electronic trading platform NYBOT needed in order to move from open-outcry floor trading, which it had used exclusively since 1870, to an electronic platform after relocating to NYMEX following 9/11.

Electronic Trading
• Developed Struts/Hibernate web application front end for external trading partners and Swing desktop application for floor traders.
• Developed the model objects that represented standard commodity contracts as well as esoteric derivative instruments like crack spreads.
• Developed secure account creation and management and secure transaction management including auditing and nonrepudiation components.
Compliance Applications
• Developed market, position and financial surveillance desktop apps in Swing.
Li & Fung USA
Senior C# Developer
Company Name: Li & Fung USA
Dates Employed: Jun 2003 – May 2005 · Duration: 2 yrs
Location: Greater New York City Area
C# and BizTalk developer responsible for developing an international logistics management system, as well as a business-process integration system between the New York, Istanbul and Hong Kong offices, with a nine-member team.
Logistics Management System
• Developed a logistics management system in C# to handle the movement and storage of goods from the point of origin, typically southeast Asia, to major market retailers.
• Developed a business process integration system using BizTalk and Sterling Gentran to integrate EDI transmissions with the logistics management system.
• Developed supply chain management intelligence system using Cognos’ OLAP and BI.
• Won 2005 Wired 40 Award
SkillSoft
Senior Test Analyst
Company Name: SkillSoft
Dates Employed: Oct 2002 – Apr 2003 · Duration: 7 mos
Location: Greater Chicago Area
I worked for Thomson/NetG (now Skillsoft) putting together a unified testing system for compliance with international security, privacy and accessibility regulations.
• Created a unified system of software testing protocols using TestDirector, Mercury Load Runner and RequisitePro to measure compliance in these areas:
o Accessibility-testing procedures for Section 508 (US) and Web Content Accessibility Guidelines (EU) compliance
o US Government Information Security Reform Act and UK Data Protection Act
o P3P implementation and privacy-policy compliance
o Security testing and forensics with ISS RealSecure and various Unix tools on Sun and BSD, for OWASP compliance
The Chiva Group
Senior PHP Developer
Company Name: The Chiva Group
Dates Employed: Mar 2001 – Sep 2002 · Duration: 1 yr 7 mos
Location: Greater Chicago Area
Owner and CTO, marketing my own web-based platform for Spanish-language call-center management of satellite services; managed a five-member PHP team in Bogotá, Colombia.
Dell Services
Senior Java and PHP Developer
Company Name: Dell Services
Dates Employed: May 2000 – Feb 2001 · Duration: 10 mos
Location: Greater New York City Area
Lead developer of a 15-member team building the first international trade-finance auction site.
• Developed a secure, internationalized web-based auction site using PHP.
• Developed the back end processes to accurately manage negotiable paper such as bills of exchange, promissory notes and other securities for trade in Java using Oracle.
• Developed the secure exchange with Deutsche Bank using a Java web service.
• Assisted Deutsche Bank in implementing the receiving web services.
• Won Global Finance's Best Trade Finance Software (2001)