Job Details
ETL/Data Engineer/Data Architect
Location: Remote work
Must have an active Secret security clearance
Job Description
Node is currently seeking a motivated, career- and customer-oriented Senior Data Architect to begin an exciting and challenging career with our large Enterprise Application Support Program on one of our project delivery teams.
Job Responsibilities
· Design and implement effective database structures and models to store, retrieve, and analyze data.
· Develop, construct, test, and maintain scalable data pipelines to collect, process, and integrate data from various sources.
· Implement ETL (Extract, Transform, Load) processes to ensure data consistency and quality.
· Integrate data from different sources, ensuring consistency, reliability, and accuracy.
· Develop data APIs and automation scripts to streamline data integration and workflows.
· Monitor and optimize database and data processing system performance.
· Conduct performance tuning and troubleshoot data issues.
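The pipeline duties above can be illustrated with a minimal sketch. This is a hypothetical example, not part of the role's actual codebase: it extracts records from a CSV source, applies a basic quality check during the transform step, and loads the result into a SQLite table, using only the Python standard library.

```python
import csv
import io
import sqlite3

# Extract: read raw records from a CSV source (an in-memory file here;
# a real pipeline would pull from files, APIs, or upstream databases).
raw = io.StringIO("id,name,amount\n1,alice,10.5\n2,bob,\n3,carol,7.25\n")
rows = list(csv.DictReader(raw))

# Transform: enforce types and drop records that fail a basic quality
# check, mirroring the consistency/quality goals of an ETL step.
clean = []
for r in rows:
    if not r["amount"]:  # reject incomplete records
        continue
    clean.append((int(r["id"]), r["name"].title(), float(r["amount"])))

# Load: write the validated records into a target database table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", clean)

count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone()
print(count, total)  # 2 17.75 (the record with a missing amount was dropped)
```

In a production pipeline the same three stages would typically be orchestrated by a scheduler and target a warehouse rather than SQLite, but the extract/validate/load shape is the same.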
Requirements
Required:
· Bachelor's degree in Computer Science, Management Information Systems, or relevant discipline (4 years of equivalent experience)
· 8+ years' experience with:
o Proven experience as a Data Architect, Data Engineer, or in a similar role.
o Extensive experience in designing and implementing data architectures.
o Hands-on experience in developing and managing data pipelines and ETL processes.
o Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, SQL Server).
o Experience with big data technologies (e.g., Hadoop, Spark) and ETL tools.
o Strong programming skills in languages such as Python, Java, or Scala.
Security Clearance Requirements
· Must be a U.S. citizen
· Ability to obtain an IRS MBI (Minimum Background Investigation) clearance from the federal agency.
· Active IRS MBI Clearance is highly desirable
Company Overview:
Node.Digital is an independent Digital Automation & Cognitive Engineering company that integrates best-of-breed technologies to accelerate business impact.
Our Core Values help us in our mission. They include:
· Identifying the RIGHT PEOPLE and developing them to their full capabilities
· Our customer's mission is our mission. Our MISSION FIRST approach is designed to keep our customers fully engaged while becoming their trusted partner
· We believe in SIMPLIFYING complex problems with a relentless focus on agile delivery excellence
· Our mantra is "Simple * Secure * Speed" in the delivery of innovative services and solutions
Benefits
We are proud to offer competitive compensation and benefits packages, including:
- Medical
- Dental
- Vision
- Basic Life
- Health Savings Account
- 401K
- Three weeks of PTO
- 10 Paid Holidays
- Pre-Approved Online Training
Frequently Asked Questions
Mastering SQL and database management systems like MySQL or PostgreSQL is essential. Additionally, hands-on experience with big data tools such as Hadoop or Spark, and proficiency in programming languages like Python or Java, are critical to architect resilient data pipelines and scalable ETL processes.
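As an illustration of the SQL proficiency described above, the following sketch runs a join-and-aggregate query against SQLite (the schema and data are invented for this example and are not from the posting):

```python
import sqlite3

# Hypothetical schema: customers and their orders.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
INSERT INTO customers VALUES (1, 'east'), (2, 'west'), (3, 'east');
INSERT INTO orders VALUES (10, 1, 100.0), (11, 2, 50.0), (12, 3, 25.0), (13, 1, 75.0);
""")

# Aggregate revenue per region: a JOIN plus GROUP BY, the bread and
# butter of analytical SQL work on any of the listed database systems.
query = """
SELECT c.region, COUNT(o.id) AS n_orders, SUM(o.total) AS revenue
FROM orders o
JOIN customers c ON c.id = o.customer_id
GROUP BY c.region
ORDER BY revenue DESC
"""
results = list(conn.execute(query))
print(results)  # [('east', 3, 200.0), ('west', 1, 50.0)]
```

The same query pattern carries over essentially unchanged to MySQL, PostgreSQL, or SQL Server; only connection handling and minor dialect details differ.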
Starting as data engineers focusing on pipeline development, professionals often progress to architect roles where strategic data modeling and integration design become central. Seniority brings leadership in managing complex data ecosystems, driving innovation, and mentoring junior engineers within data-driven enterprises.
Remote data architects must balance efficient cross-team communication with the technical demands of maintaining data integrity. Troubleshooting integration complexities and ensuring pipeline performance remotely requires self-discipline and advanced automation skills to uphold data reliability from diverse sources.
Local employers in Pearl River value candidates with active security clearances due to government-linked projects. Certifications in cloud platforms and big data tools complement the clearance requirement, positioning candidates competitively in a market that blends enterprise data solutions with secure data handling.
Better Northshore Jobs emphasizes a mission-first approach, integrating data roles deeply into customer-driven projects. Data Architects there are expected to simplify complex systems while ensuring security and speed, aligning with the company's dedication to agile delivery and trusted partnership.
Beyond demanding active secret security clearance, these roles at Better Northshore Jobs focus on cognitive automation and digital integration, calling for a blend of technical expertise and a mission-driven mindset. Employees engage in large-scale enterprise applications with a strong emphasis on innovative, secure solutions.
Typically, data engineers and architects in Pearl River earn between $110,000 and $140,000 annually, reflecting regional cost of living and market demand. This range aligns well with national averages for senior data roles, especially when factoring in specialized security clearance requirements.
Obtaining an IRS MBI or secret clearance involves thorough background checks and can take several months. Maintaining clearance requires ongoing compliance with federal standards. It's a critical hurdle that also enhances job security and access to sensitive, high-impact data projects.
A frequent misunderstanding is that ETL/Data Architects only handle data pipelines, but their role extends to strategic data design, performance tuning, and automation scripting. They ensure data accuracy and security, often acting as a pivotal bridge between business needs and technical infrastructure.
Python is widely favored for its rich libraries and versatility in data tasks, while Java and Scala are preferred for scalable, performance-intensive applications. Choosing the right language depends on the specific integration requirements and the existing data ecosystem within the organization.