Sunday, May 29, 2016

Deep Learning tools such as Caffe, Torch & Graph analytics tools: GraphX, Apache Giraph

Python, R, SAS, Hadoop, Machine Learning, SQL, Natural Language Processing (NLP), NLP tools (such as NLTK), Apache Lucene, Solr, Graph analytics tools: GraphX, Apache Giraph, Deep Learning tools such as Caffe, Torch

Located in Chicago, IL, we are searching for a Lead Data Scientist with a Master's degree and experience in NLP, NLTK, Apache Lucene, Solr, and large data sets to add to our growing team. We are the leading company in our industry, and we need someone to work with data sets of varying size and complexity, including both structured and unstructured data. Are you local to Chicago? Are you looking to make an impact on a growing team at an industry-leading company? Please apply today!

Top Reasons to Work with Us

1.) Full paid benefits
2.) Leading company in the healthcare industry

What You Will Be Doing

Primary duties may include, but are not limited to:

* Design machine learning projects to address specific business problems determined in consultation with business partners.
* Pipe and process massive data streams in distributed computing environments such as Hadoop to facilitate analysis.
* Implement batch and real-time model scoring to drive actions.
* Develop proprietary machine learning algorithms to build customized solutions that go beyond standard industry tools and lead to innovative solutions.
* Develop sophisticated visualizations of analysis output for business users.
* Publish results and address constraints/limitations with business partners.
* Provide high-level controllership/evaluation of all output produced to ensure established targets are met.
* Determine continuous improvement opportunities for current predictive modeling algorithms.

What You Need for this Position

* Advanced expertise with software such as Python, R, SAS. Programming experience in Python is strongly preferred.
* Experience working with distributed computing environments such as Hadoop.
* Intermediate to advanced knowledge of Hive, Impala, or Apache Spark.
* Intermediate to advanced knowledge of data extraction and manipulation using SQL.
* Master's or PhD in Statistics, Computer Science, Mathematics, Machine Learning, Econometrics, Physics, Biostatistics, or a related field.
* 2+ years' experience utilizing NLP applications such as topic models and sentiment analysis to identify patterns within data sets is strongly preferred.
* Experience using open source NLP tools (such as NLTK), Apache Lucene, Solr, etc. is preferred (a brief NLTK sketch follows this list).
* 3-5 years in Predictive Analytics.
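
The posting does not prescribe any particular library, but as a rough, minimal sketch of the kind of NLP work mentioned above (sentiment analysis with NLTK), something like the following could serve as a starting point; the sample sentences are invented for illustration.

# Minimal, illustrative NLTK sentiment sketch (not part of the posting).
# Requires NLTK plus a one-time download of the VADER lexicon.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-time lexicon download

analyzer = SentimentIntensityAnalyzer()

# Hypothetical example documents; real input would come from your data sets.
documents = [
    "The claims process was quick and the staff were very helpful.",
    "I waited three weeks and still have no answer about my coverage.",
]

for doc in documents:
    scores = analyzer.polarity_scores(doc)  # neg/neu/pos/compound scores
    print("{:+.3f}  {}".format(scores["compound"], doc))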

Hadoop Administrator 3 - Health IT

Job Description


Northrop Grumman's Technology Services is seeking a Hadoop Administrator with strong leadership and technical capabilities to join our team of qualified, diverse individuals. This position will be located in Woodlawn, MD. Look to a future of excellence by joining a Northrop Grumman team delivering cutting edge technology solutions to our clients. The qualified applicant will become part of Northrop Grumman's Health Solutions Management division which focuses on healthcare IT solutions for our Federal, state, and local government clients.

Roles and Responsibilities


* Responsible for installing and configuring Hadoop components.


* Responsible for administering the Hadoop system for peak performance.


* Develop reporting metrics to measure system performance.


* While the candidate will be working in a development environment using an Agile development methodology, the candidate must be familiar with the full software lifecycle, including requirements engineering, system design, software development, unit and integration testing, formal software testing procedures, configuration management, and quality assurance techniques.


* Interface with program leadership to accomplish program objectives and identify process improvements.


* Participate in and support the team in designing Hadoop solutions.


* Develop and own SOPs and other technical documentation for Program's Process Asset Library.

Qualifications


Basic Qualifications


* Bachelor's Degree in Communication Engineering, Computer Engineering, Computer Science, Electrical Engineering, Information Systems, Mathematics, Systems Engineering, or a similar degree, plus 6 years of experience


* Hadoop Administration


* Knowledge of Ambari, Hive, HBase, HCatalog, Sqoop, Pig, Loom, Knox, Ranger and Spark


* Understanding of Data Warehousing


* Ability to perform root cause analysis and control gap identification


* 5+ years of experience related to BI tools, analytics, and healthcare data


* 2+ years of Hadoop platform administration


* Ability to assess possible architectural limitations or shortcomings in such areas as scaling, speed, and throughput


* Developing system performance, availability, scalability, manageability, and security requirements for mid-to-large scale programs


* Experience in architecture definition and design of solutions incorporating Hadoop and MapReduce frameworks


* Experience with Hive

Preferred Qualifications


* Excellent verbal and written communications skills are highly desired


* Developing solutions integrating and extending COTS products


* Integrating COTS and GOTS products from multiple vendors

Northrop Grumman is committed to hiring and retaining a diverse workforce. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, creed, sex, sexual orientation, gender identity, marital status, national origin, age, veteran status, disability, or any other protected class. For our complete EEO/AA and Pay Transparency statement, please visit www.northropgrumman.com/EEO. U.S. Citizenship is required for most positions.

Hadoop Infrastructure Engineer - Foundational Services


Hadoop Infrastructure Engineer - Foundational Services

Analysis, Automated, Business Intelligence, Configuration Management, Data Analysis, Database, Developer, Development, Hadoop, Hardware, Java, Linux, Management, Networking, Python, Ruby, Security, Shell Scripting, Testing, Unix
Full Time
Telecommuting not available
Travel not required

Job Description


Job Requisition Number: 49874




Many of the most prevalent applications at Bloomberg need real-time data analysis and business intelligence. Instead of using off-the-shelf applications, we use Hadoop and its ecosystem to provide a large-scale data platform with low-latency SLAs and large storage capabilities. This platform has revolutionized the way we manage and analyze data in a distributed environment.


We design software and hardware systems to support low-latency/high-volume requests, security, fault tolerance/high availability, and easy customization. Our Hadoop Infrastructure Platform is built to fully automate deployment and operations using Chef; it is developed and open sourced at https://github.com/bloomberg/chef-bach . With hundreds of applications depending on our platform, we are looking to grow our Hadoop Infrastructure team. That's where you come in.


We'll trust you to:



  • Evaluate Hadoop projects across the ecosystem and extend and deploy them to exacting standards (high availability, big data clusters, elastic load tolerance)
  • Develop automation, installation and monitoring of Hadoop ecosystem components in our open source infrastructure stack, specifically HBase, HDFS, Map/Reduce, Yarn, Oozie, Pig, Hive, Tez, Spark and Kafka
  • Dig deep into performance, scalability, capacity and reliability problems to resolve them
  • Create application patterns for integrating internal application teams and external vendors into our infrastructure
  • Troubleshoot and debug Hadoop ecosystem run-time issues
  • Provide developer and operations documentation to educate peer teams


You'll need to have most of the following:




  • Experience building out and scaling a Hadoop-based or UNIX-hosted database infrastructure for an enterprise
  • 2+ years of experience with Hadoop infrastructure or a strong and diverse background of distributed cluster management and operations experience
  • Experience writing software in a continuous build and automated deployment environment


We'd love to see:




  • 2+ years of DevOps or System Administration experience using Chef/Puppet/Ansible for system configuration, or quality shell scripting for systems management (error handling, idempotency, configuration management)
  • In-depth knowledge of low-level Linux, UNIX networking and C system calls for high performance computing
  • Experience with Java, Python or Ruby development (including testing with standard test frameworks and dependency management systems, knowledge of Java garbage collection fundamentals)
  • Experience or exposure to the open source community (a well-curated blog, upstream accepted contribution or community presence)



We want to work with others who are passionate about community-driven development both within the company and with the wider open source community. If this sounds like you, submit an application, and learn more about the work we do from Clay and Amit's interview at ChefConf 2015: https://www.youtube.com/watch?v=LnMCFxXgDEw




Date: Wed, 25 05 2016 00:00:00 GMT
Department: Software Developer/Engineering


Company Information

Bloomberg is a company dedicated to helping solve complex challenges through insight and information. Our strength—quickly and accurately delivering data, news and analytics through technology—is at the core of everything we do. With over 15,500 employees in 192 locations, we give influential decision makers in business, finance and government a competitive edge by connecting them to a dynamic network of news, people and ideas. To do so, we need constant energy and innovation—which is where you come in. At Bloomberg, you will have the opportunity to take risks and be part of an organization that is entering new markets, launching new ventures and pushing the boundaries. Our ever-expanding technology, data, news and media services afford employees the opportunity to expand skills and connect with smart, driven colleagues from a diversity of backgrounds and ideas. We're looking for dynamic, multi-talented people who have a desire to thrive in a forward-thinking culture and a business with global impact. Are you ready to make your mark? Learn more about our businesses and opportunities at bloomberg.com/careers

Friday, May 27, 2016

The Art of Game Design, 2nd Edition

The Art of Game Design, 2nd Edition

Overview


Good game design happens when you view your game from as many perspectives as possible. Written by one of the world's top game designers, The Art of Game Design presents 100+ sets of questions, or different lenses, for viewing a game’s design, encompassing diverse fields such as psychology, architecture, music, visual design, film, software engineering, theme park design, mathematics, puzzle design, and anthropology. This Second Edition of a Game Developer Front Line Award winner:

  • Describes the deepest and most fundamental principles of game design
  • Demonstrates how tactics used in board, card, and athletic games also work in top-quality video games
  • Contains valuable insight from Jesse Schell, the former chair of the International Game Developers Association and award-winning designer of Disney online games
The Art of Game Design, Second Edition gives readers useful perspectives on how to make better game designs faster. It provides practical instruction on creating world-class games that will be played again and again.

Book: http://www.amazon.com/Art-Game-Design-Lenses-Second/dp/1466598646/ref=as_li_tf_tl?tag=intell0b-20

Library: http://proquestpubliclibrary.safaribooksonline.com.ezproxy.nvcl.ca/book/programming/game-programming/9781466598645

 



Nmap Scripting Engine

Nmap Scripting Engine

Nmap is a well-known security tool used by penetration testers and system administrators for many different networking tasks. The Nmap Scripting Engine (NSE) was introduced during Google's Summer of Code 2006 and added the ability to perform additional tasks on target hosts, such as advanced fingerprinting, service discovery, and information gathering [xyz1].

This book will teach you everything you need to know to master the art of developing NSE scripts. The book starts by covering the fundamental concepts of Lua programming and reviews the syntax and structure of NSE scripts. After that, it covers the most important features of NSE. It jumps right into coding practical scripts and explains how to use the Nmap API and the available NSE libraries to produce robust scripts. Finally, the book covers output formatting, string handling, network I/O, parallelism, and vulnerability exploitation [xyz1].

[xyz1] Mastering the Nmap Scripting Engine, by: Paulino Calderón Pale, 2015.
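
NSE scripts themselves are written in Lua and run inside Nmap, which is beyond the scope of a quick snippet here; as a purely illustrative sketch, the following Python fragment just shells out to the nmap command line and runs the stock http-title NSE script against Nmap's public test host. It assumes the nmap binary is installed and on PATH.

# Illustrative only: drive an existing NSE script from Python via the nmap CLI.
import subprocess

result = subprocess.run(
    ["nmap", "-p", "80", "--script", "http-title", "scanme.nmap.org"],
    capture_output=True,   # collect stdout/stderr instead of printing directly
    text=True,             # decode output as text
)
print(result.stdout)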

Monday, May 23, 2016

Senior Platform Engineer

Senior Platform Engineer

  • Palo Alto, CA
  • Full-time

Job Description

Atlassian is looking for experienced and talented Platform Engineers to join our new Platform Development team. This role involves the design, implementation and maintenance of a new hosted platform. Be a part of an energetic and fast moving team that delivers incredible, innovative improvements to our products. You’ll collaborate often with other developers to write the best code for the project and deliver amazing results that our users love.

MORE ABOUT YOU

You confidently design and implement high-performance RESTful micro-services serving millions of requests a day. You are fluent in a modern object-oriented programming language such as Java, Scala, Python, JavaScript, etc. Open to collaboration, you thrive on innovation and solving complex issues. You have broad knowledge and understanding of the SaaS, PaaS, and IaaS industry, with hands-on experience of public cloud offerings (AWS, GAE, Azure). With 5+ years in a similar development role, you are able to tackle large problems whilst maintaining a relaxed yet enthusiastic attitude. Knowledge of open source technologies in the PaaS/IaaS space is a big bonus. Experience working with agile software development methodologies like XP and Scrum, continuous delivery, and infrastructure as code would be a big plus as well.


MORE ABOUT OUR TEAM
We are always growing, learning, and adapting, in and out of the office. You’ll be joining a team that is crazy smart and very direct. We ask hard questions and challenge each other to constantly improve our work. We are self-driven but team oriented. We're dedicated to agile methodology and big believers in 'lean' (which means we don’t do documentation for documentation's sake). We know the importance of validating our assumptions about users and implement various types of testing to prove ourselves right (or wrong). Our bottom line is improving our users' experience, no matter what.

Additional Information

All your information will be kept confidential according to EEO guidelines.

Ref: https://www.smartrecruiters.com/Atlassian 

Twisted (Python/Twisted)

Twisted (Python/Twisted)

Twisted is an event-driven network programming framework written in Python and licensed under the MIT License. Twisted projects variously support TCP, UDP, SSL/TLS, IP multicast, Unix domain sockets, a large number of protocols (including HTTP, XMPP, NNTP, IMAP, SSH, IRC, FTP, and others), and much more. Twisted is based on the event-driven programming paradigm, which means that users of Twisted write short callbacks which are called by the framework. Among its many features, Twisted includes implementations of numerous protocols, such as HTTP, FTP, SMTP, POP3, IMAP4, DNS, IRC, MSN, OSCAR, XMPP/Jabber, Telnet, SSH, SSL, NNTP, Finger, Ident, DJB's netstrings, simple line-oriented protocols, Perspective Broker (PB), and the Asynchronous Messaging Protocol (AMP).
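
To make the callback style concrete, here is a minimal sketch of a TCP echo server in Twisted: the framework calls dataReceived whenever bytes arrive on a connection, and the protocol simply writes them back. The port number is arbitrary.

from twisted.internet import protocol, reactor

class Echo(protocol.Protocol):
    def dataReceived(self, data):
        # Called by the reactor whenever data arrives; echo it back.
        self.transport.write(data)

class EchoFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return Echo()

# Listen on an arbitrary port and hand control to Twisted's event loop.
reactor.listenTCP(8000, EchoFactory())
reactor.run()

Running this and connecting with telnet localhost 8000 echoes back whatever you type; the reactor, not user code, drives the read loop.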

Sunday, May 8, 2016

Progress Bar of HTML 5



Progress Bar of HTML 5

In HTML 5, the <progress> tag represents the progress of a task.

Example for the progress element of HTML 5: 

<progress max="100" value="22"> </progress>

  

Note 1: The progress tag is not supported in Internet Explorer 9 and earlier versions.


Note 2: Use the <progress> tag in conjunction with JavaScript to display the progress of a task.


Styling Progress Bar

In the stylesheet:


progress {
  background-color: #99ff99;
  border: 0;
  height: 18px;
  border-radius: 9px;
}


 

Monday, May 2, 2016

Diawi

Diawi  

Diawi is a tool for iOS developers to deploy Development and Ad hoc iOS applications or install them directly on the device.





iFunBox

iFunBox 

Launched in August 2008, iFunBox is one of the best file managers for iPhone, iPad and iPod Touch. With iFunBox you can manage files on your device just like Windows File Explorer on your PC, take advantage of the device's storage and use it as a portable USB disk, and import/export music, video, and photo files with no effort. The best part about using iFunBox is that it requires no jailbreak at all.

Core Feature 1: Total Control of iPhone/iPad File System
Manage files on your iPhone or iPad in a way just like Windows File Explorer but more robust and friendly. Easily transmit files and folders to your computer with the optimized file transfer and browsing. iOS 6 is now fully supported.
iPhone/iPad File Manager
Navigation, Upload and Download
Quick Preview, Drag&Drop
Delete, Rename and Move
Asian / Long Filename Support

Core Feature 2: One-stop App Install and Backup
Safe app install on iPhone/iPad with .ipa packages. No jailbreak required for installing purchased apps. Supports installing unofficial/unsigned .ipa packages based on AppSync, without Installous. Back up installed apps to .ipa packages for sharing and reinstallation. Uninstall/install apps in batch easily.
iOS App Manager
One-step Batch uninstall/install App
Back up installed App to .ipa package
Install unsigned .ipa package
Quick App list preview

Core Feature 3: Hi-Speed General Purpose Storage
Use your iPhone as a USB portable disk for general files. Exploit the large flash memory on iPhone or iPad with transmission speeds beyond 5MB/s on iPhone and 15MB/s on iPad.
iPhone/iPad Portable Disk
Realtime Progress Indicator
High Speed Data Transmission
Scheduled Data Transferring
Recursive Copy Subfolders

Core Feature 4: Access App sandbox without Jailbreak
Accessing an app's sandbox is very useful for backing up or sharing saved games and app settings. You can also upload video to a 3rd-party player like “OPlayer HD” instead of using iPod, and download documents you created in an app like "Numbers".
Play with all user applications
No jailbreak needed
Backup/sharing data generated by Apps
Upload PDF, Word Doc, videos and other files for viewing/playback

Core Feature 5: Wallpaper Function
Thumbnail previewing helps a lot with wallpaper management. It is also integrated in all other folders, including the camera folder. Now it is easier to selectively delete photos in the camera folder and replace resource images.
Upload Wallpaper in Batch
Image Thumbnail Preview
Batch Uploading and Conversion
Change Image Resolution
High Quality Image Resizing

Core Feature 6: Export Music and Movie on iPhone/iPod
Export Apple iTunes© managed audio and video from your iPhone, iPad or iPod to PC as a backup copy or to burn to CD. Scan and populate music and movie files on iPhone and iPod even when iTunes refuses to connect.
Backup iPod Music & Movie
iTunes© Managed Media Files
Recover Songs and Movie
Copy from iPad/iPhone
Copy to PC with Friendly Title 



Source: http://www.i-funbox.com/about_us.html