WORK EXPERIENCE

Consultant and Full-Stack Engineer, Data Conversion Laboratories, Fresh Meadows, NY, 11/2016-present
• Scrum Web Portal: Create and edit objectives and track their progress. Currently migrating the portal to Angular 2/4 via test-driven development (TDD). MongoDB 3.2.9, Express.js 4.14.0, Angular 1.4.12, Node.js 6.6.0
• Converted XML documents using the GATE (General Architecture for Text Engineering) NLP (Natural Language Processing) engine's Java API, Perl regular expressions, and libxml. Java SE 1.8.0, NetBeans 8.2, Perl 5.16.2

Full-Stack Engineer, PR Newswire, New York, NY, 7/2013-8/2016
Single-handedly built two web-based data portals from scratch, each with a corresponding web-scraping daemon.
• Web Portal 2: Angular.js-based single-page application (SPA) to view, filter, sort, and update filings information for over 1 million filings web-scraped by my C++ crawler. Export to CSV file. Bootstrap for responsive design. Implemented a RESTful Web API with Node.js (JSON). Technologies: Angular 1.4, Node.js 0.12 (tedious-ntlm), Express.js 4.12, Bootstrap 3.3, gulp, npm, bower, stylus, Python, Microsoft SQL Server 2012
• Web Portal 1: HTML/CSS/JavaScript/jQuery web front end to view, filter, and sort filings information web-scraped by my C++ crawler. Expando-tables of nested drill-down data. Export to CSV file. Technologies: Apache 2.2, MySQL 5.5, PHP 5.4, XML, XSLT, Perl; Symfony 2.4.2 back end, including Symfony's StreamedResponse class
• Web Crawler Daemon for Web Portal 2: C++ (curl lib, ADO, Google Test, STL) multi-threaded background process that downloads filings information, parses headers on the fly via a finite-state machine, and loads them into Microsoft SQL Server via T-SQL stored procedures, which normalize the data. Design Patterns: Observer, Decorator
• Web Crawler Daemon for Web Portal 1: C++ (Boost, curl lib, MySQL lib, Google Test, STL) background process that downloads XML filings information, parses it into MySQL tables via a finite-state machine and the "State" Design Pattern (about 100 different states), and normalizes the data. Round-trip XML-to-MySQL-to-XML unit test ensures parsing integrity.

Independent Consultant, 10/2006-6/2013

Programmer, Jefferies & Company, Jersey City, NJ, 9/2003-7/2006
• Asset Management Portal: Architected and developed a Perl-based straight-through processing system that automatically downloaded and reconciled seven commodities portfolios. Created a PHP-based web interface. PHP, HTML, CSS, JavaScript, Perl, shell scripts, and Sybase on Solaris.

EDUCATION

RUTGERS UNIVERSITY, New Brunswick, NJ
Bachelor's in Computer Science
Bachelor's in Physics