Data Engineer

  • Milwaukee
  • Full-time
  • Information Technology

Company Description

Expert Institute is an established legal services and technology company. At our core, we match attorneys with the expertise and resources they need to win complex litigation. We’ve been at it for more than a decade and count some of the nation’s most prominent law firms among our clients.

Central to our service and product delivery is Expert iQ, a web-based software solution launched in 2018. Expert iQ helps streamline collaboration between attorneys, experts, and team members at Expert Institute. It provides a repository of work history, a hub for collaboration, and a strategic delivery platform for our new and expanding content initiatives.

Job Description

We are looking for a Data Engineer who is eager to make a significant impact on our overall data strategy, from concept to production. Data work here spans our customers and prospects as well as our expert network: candidates we’ve worked with in the past, and experts in a given field who’ve never heard of Expert Institute but could be impactful on our clients’ cases.

While you always take a data-first approach to making improvements, you also have a deep creative streak and look for ways to be innovative. You work seamlessly with contributors and stakeholders company-wide to produce meaningful projects that align with broader business goals. Most importantly, you enjoy making your ideas a reality in a fast-paced environment where your work can have a direct impact on the company as a whole.

WHAT YOU'LL DO

The Data Engineer reports to the Chief Financial Officer, who manages the Strategy & Analytics department, and will work closely with other executives across the company.

  • Manage APIs with existing partners’ databases

  • Develop analytics that demonstrate value and deep insights to our clients

  • Compile information for licensing with new databases and vendors

  • Create statistical analyses that will resonate with clients and withstand questioning

  • Develop web crawlers to collect data across thousands of websites and databases

  • Match and sanitize large data sets against existing databases 

  • Manage and create analytics dashboards to be presented to clients in Expert iQ

Qualifications

  • B.A./B.S. degree or certification in an applicable field or program

  • 1-3 years of experience in large scale data collection and analysis

  • High proficiency in Python, PHP, regular expressions, SQL, MySQL, SoQL, HTML, CSS, and Excel

  • Experience with web crawling tools such as ScrapingHub, Grepsr, ParseHub, or Diffbot

  • General database expertise

  • Salesforce experience a big plus

  • Excellent organizational skills and attention to detail

  • Ability to work independently and own projects through completion

Additional Information

All your information will be kept confidential according to EEO guidelines.