33 NoSQL Database jobs in Kuwait
Data Engineer

Posted 2 days ago
Job Description
**Leidos National Security Sector (NSS) is seeking a highly experienced, mission-driven Data Engineer to support APOLLO/OPIAS in Tampa, FL.** This role provides mission-critical support to Joint and Special Operations Forces (SOF) by designing, building, and maintaining scalable data infrastructure and pipelines that enable the planning, execution, and assessment of influence operations within Irregular Warfare (IW) and Operations in the Information Environment (OIE). With expertise in cloud platforms, ETL frameworks, and both structured and semi-structured data, the data engineer ensures seamless ingestion, transformation, and normalization of diverse data sources, including social media, OSINT, SIGINT, cyber, and psychological operations platforms. The position is responsible for providing OIE planners, analysts, and commanders with timely, mission-relevant data access for real-time alerting, influence mapping, and trend analysis. **This position is on a future contract pending award announcement.**
**Possible locations for this position are as follows:**
+ MacDill (Tampa, FL)
+ Al Udeid (Qatar)
+ Fort Meade (Maryland)
+ Northcom (Colorado Springs, CO)
+ Camp Humphreys (Korea)
+ Arifjan (Kuwait)
+ Joint Base Pearl Harbor-Hickam (Hawaii)
+ Fort Eisenhower (Georgia)
+ Offutt AFB (Omaha, NE)
+ Naval Operating Base Norfolk (Virginia)
+ Southcom (Doral, FL)
+ JB San Antonio (Texas)
+ Stuttgart (Germany)
+ Vicenza (Italy)
+ Tyndall AFB (Florida)
**Key Responsibilities:**
+ Conduct analysis of structured and semi-structured data sets to identify effective integration approaches for mission use.
+ Design, build, and maintain the data infrastructure and pipelines that support the planning, execution, and assessment of influence operations.
+ Review existing and emerging technical capabilities and offer recommendations on their potential value in enabling OIE planners, analysts, and commanders to access, analyze, and operationalize large-scale datasets, often derived from social media, open-source intelligence (OSINT), cyber, SIGINT, or psychological operations platforms.
**Basic Qualifications:**
+ Bachelor's degree in Computer Science, Data Science, Engineering, or a related technical field
+ 8+ years of experience in data engineering or ETL pipeline development
+ Experience with data ingestion, transformation, and normalization from diverse structured and unstructured sources
+ Experience deploying in cloud environments (AWS, Azure, or GCP)
+ Proficient in Python and at least one ETL framework (e.g., Airflow, NiFi, Luigi); a minimal pipeline sketch follows this list
+ Strong knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL)
+ Familiarity with version control tools (e.g., Git) and collaborative DevOps practices
+ Ability to work in cross-functional teams alongside analysts, developers, and IO planners
+ Strong documentation, communication, and troubleshooting skills
+ Active TS/SCI security clearance
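As a concrete illustration of the Python-plus-Airflow proficiency called for above, here is a minimal, hypothetical sketch of an ingest-normalize-load pipeline; the DAG id, source data, and helper names are illustrative assumptions, not part of the posting.

```python
# Hypothetical sketch only: a minimal Airflow (2.4+ TaskFlow) pipeline that
# ingests raw records, normalizes them, and loads them downstream.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2025, 1, 1), catchup=False)
def ingest_and_normalize():
    @task
    def extract() -> list[dict]:
        # Stand-in for pulling raw, semi-structured records from a source.
        return [{"id": 1, "text": "  Raw Post  ", "lang": "ar"}]

    @task
    def normalize(records: list[dict]) -> list[dict]:
        # Trim and lowercase free text so downstream tools see one schema.
        return [{**r, "text": r["text"].strip().lower()} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # Stand-in for writing normalized rows to the analytic store.
        print(f"loaded {len(records)} rows")

    load(normalize(extract()))


ingest_and_normalize()
```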
**Preferred Qualifications:**
+ Master's degree in a technical discipline
+ Experience supporting Information Operations, PSYOP/MISO, or WebOps
+ Experience with data lake architecture, graph databases (e.g., Neo4j), or NoSQL stores (e.g., MongoDB)
+ Experience building pipelines that support real-time alerting, trend analysis, and influence mapping
+ Proficiency with data visualization frameworks (e.g., Kibana, Grafana, Plotly, or D3.js)
+ Familiarity with OSINT data platforms (e.g., Babel Street, Echosec, Talkwalker, Pulsar, Meltwater, Maltego)
+ Familiarity with containerized environments (Docker, Kubernetes)
+ Understanding of foreign language datasets or multilingual processing (NLP/NLU in Arabic, Russian, Chinese, etc.)
+ Background in API integration with social media platforms or dark web forums
EIO2024
**Original Posting:**
July 25, 2025
For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
**Pay Range:**
Pay Range $104,650.00 - $189,175.00
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.
REQNUMBER: R-00163302-OTHLOC-PL-2D0099
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status. Leidos will consider qualified applicants with criminal histories for employment in accordance with relevant Laws. Leidos is an equal opportunity employer/disability/vet.
Data Engineer
Posted today
Job Description
Data integration, unification, cleansing and data quality management
Job Responsibilities
Management of data inflow
Create and maintain optimal data pipeline architecture; adopt new technologies to improve existing frameworks of data flow and monitoring
Assemble large, complex data sets that meet functional / non-functional business requirements
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies; take the necessary steps to implement changes in the IT infrastructure, such as MDM tool acquisition, data lake design, and cloud solution implementation, in coordination with IT and the technology coordinator
Create data tools for analytics and data scientist team members that assist them in building models by automating and simplifying data preparation
Translate customer data strategy into actionable data integration plans and execute these plans
Maintain a 360-degree view of the customer; enhance customer data marts by continuously integrating new sources of data
Data cleansing and unification
Create automated data anomaly detection systems and constantly track their performance
Process, cleanse, and verify integrity of data used for analysis; active use of built-in data quality dashboard on CDP and coordination of corrective actions
Develop algorithms to de-duplicate and export customer data from multiple BUs to ensure data unification (a minimal de-duplication sketch follows this list)
Ensure continuous unification of customer records and associated profile and transactional data
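A minimal sketch of the de-duplication idea from the list above, using pandas; the column names and the sample frame are hypothetical assumptions.

```python
# Hypothetical sketch: normalize a match key across business units, then
# keep the most recently updated record per key.
import pandas as pd

customers = pd.DataFrame([
    {"bu": "retail", "email": "A.Smith@example.com ", "updated": "2024-05-01"},
    {"bu": "auto",   "email": "a.smith@example.com",  "updated": "2024-06-10"},
])

# Case- and whitespace-insensitive match key.
customers["email_key"] = customers["email"].str.strip().str.lower()

unified = (
    customers.sort_values("updated")                     # oldest first
             .drop_duplicates("email_key", keep="last")  # newest record wins
             .drop(columns="email_key")
)
print(unified)
```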
- Understanding of data modeling principles
- 5+ years of significant configuration and data management experience; comfortable handling and manipulating data, with demonstrated experience in a data-intensive setting
- Experience building highly scalable, high-performing databases using modern NoSQL or cloud-based technologies and integrating external APIs for data acquisition
- Experience identifying data anomalies or imperfections
- Experience with data structures and schemas, data preparation, cleansing, and unification
- Possess the ability to train others - pass on knowledge within the organization
Education
IoT Data Engineer
Posted 5 days ago
Job Description
Canonical is a leading provider of open source software and operating systems to the global enterprise and technology markets. Our platform, Ubuntu, is very widely used in breakthrough enterprise initiatives such as public cloud, data science, AI, engineering innovation, and IoT. Our customers include the world's leading public cloud and silicon providers, and industry leaders in many sectors. The company is a pioneer of global distributed collaboration, with 1200+ colleagues in 75+ countries and very few office-based roles. Teams meet two to four times yearly in person, in interesting locations around the world, to align on strategy and execution.
The company is founder-led, profitable, and growing.
This is an exciting opportunity for a software engineer passionate about open source software, Linux, and Web Services at scale. Come build a rewarding, meaningful career working with the best and brightest people in technology at Canonical, a growing pre-IPO international software company.
Canonical's engineering team is at the forefront of the IoT revolution and aims to strengthen this position by developing cutting-edge telemetry and connectivity solutions. By integrating reliable, secure, and robust data streaming capabilities into the Snappy ecosystem, we are setting new standards in the industry for ease of development, implementation, management and security.
We are seeking talented individuals to help us enhance our global SaaS services, providing customers with the essential data services needed to build the next generation of IoT devices effortlessly. Our commitment to data governance, ownership, and confidentiality is unparalleled, ensuring our customers can innovate with confidence on top of the globally trusted Ubuntu platform.
Location: This role will be based remotely in the EMEA region.
What your day will look like
- Work remotely with a globally distributed team, driving technical excellence and fostering innovation across diverse engineering environments.
- Design and architect high-performance service APIs to power streaming data services, ensuring seamless integration across teams and products using Python and Golang.
- Develop robust governance, auditing, and management systems within our advanced telemetry platform, ensuring security, compliance, and operational integrity.
- Partner with our infrastructure team to build scalable cloud-based SaaS solutions while also delivering containerized on-prem deployments for enterprise customers.
- Lead the design, implementation, and optimization of new features—taking projects from spec to production, ensuring operational excellence at scale.
- Provide technical oversight, review code and designs, and set best practices to maintain engineering excellence.
- Engage in high-level technical discussions, collaborating on optimal solutions with engineers, product teams, and stakeholders.
- Work remotely with occasional global travel (2-4 weeks per year) for internal and external events, fostering deeper collaboration and knowledge-sharing.
What we are looking for in you
- You design and architect scalable backend services, messaging/data pipelines, and REST APIs using Golang or Python, guiding best practices, technical direction, and system scalability.
- You possess deep expertise in cybersecurity principles and proactively address the complex challenges of IoT environments—secure connectivity, data streaming, governance, and compliance.
- You bring proven expertise in designing and optimizing systems using:
- IAM models, encryption, access control, and compliance frameworks (GDPR, HIPAA) for secure and compliant data handling
- Decentralized data ownership models that ensure interoperability and governance across domains
- High-throughput, low-latency designs for IoT data processing
- Data streaming technologies (MQTT, Kafka, RabbitMQ; see the sketch after this list)
- Observability tools (OpenTelemetry)
- Industrial/engineering data exchange protocols (OPC-UA, ModBus)
- You thrive in cross-functional environments, partnering with product teams, engineers, and stakeholders to drive high-impact technical solutions that align with business objectives.
- You mentor junior engineers, foster technical excellence, and contribute to a culture of innovation, continuous improvement, and knowledge sharing.
- You embrace challenges with an open mind, continuously seeking opportunities to learn, improve, and innovate in a rapidly evolving IoT landscape.
- You are familiar with Ubuntu as a development and deployment platform.
- You hold a Bachelor's degree or equivalent in Computer Science, STEM, or a related field.
- Willingness to travel up to 4 times a year for internal events.
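As a hedged illustration of the MQTT-style data streaming mentioned in the list above, a minimal subscriber might look like the following; the broker host and topic are hypothetical placeholders, not Canonical infrastructure.

```python
# Hypothetical sketch: subscribe to device telemetry over MQTT and decode
# each JSON payload. Uses the paho-mqtt 1.x client API.
import json

import paho.mqtt.client as mqtt


def on_message(client, userdata, msg):
    # Each payload is assumed to be a JSON-encoded device reading.
    reading = json.loads(msg.payload)
    print(f"{msg.topic}: {reading}")


client = mqtt.Client()  # paho-mqtt 2.x additionally takes a callback API version
client.on_message = on_message
client.connect("broker.example.com", 1883)  # hypothetical broker
client.subscribe("devices/+/telemetry")     # '+' matches one device-id level
client.loop_forever()
```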
What we offer colleagues
We consider geographical location, experience, and performance in shaping compensation worldwide. We revisit compensation annually (and more often for graduates and associates) to ensure we recognize outstanding performance. In addition to base pay, we offer a performance-driven annual bonus or commission. We provide all team members with additional benefits which reflect our values and ideals. We balance our programs to meet local needs and ensure fairness globally.
- Distributed work environment with twice-yearly team sprints in person
- Personal learning and development budget of USD 2,000 per year
- Annual compensation review
- Recognition rewards
- Annual holiday leave
- Maternity and paternity leave
- Team Member Assistance Program & Wellness Platform
- Opportunity to travel to new locations to meet colleagues
- Priority Pass and travel upgrades for long-haul company events
Canonical is a pioneering tech firm at the forefront of the global move to open source. As the company that publishes Ubuntu, one of the most important open-source projects and the platform for AI, IoT, and the cloud, we are changing the world of software. We recruit on a global basis and set a very high standard for people joining the company. We expect excellence; in order to succeed, we need to be the best at what we do. Most colleagues at Canonical have worked from home since our inception in 2004. Working here is a step into the future and will challenge you to think differently, work smarter, learn new skills, and raise your game.
Canonical is an equal opportunity employer
We are proud to foster a workplace free from discrimination. Diversity of experience, perspectives, and background create a better work environment and better products. Whatever your identity, we will give your application fair consideration.
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: Software Development
Junior Data Engineer
Posted 5 days ago
Job Description
The role of a Junior Data Engineer at Canonical
Canonical has provided developers with open source since 2004, helping them build innovations such as public cloud, machine learning, robotics, and blockchain. Marketing at Canonical means being at the forefront of innovation, for our customers and for our own martech stack. We're on the lookout for a marketing data analyst to join our team and execute on our growth hacking strategy.
The ideal candidate will be passionate about technology, technology marketing, and the use of technology in marketing. You will prefer to work in an environment that emphasizes ownership of campaigns, collaboration, learning, curiosity, and a drive to continually improve yourself, the team, and the organisation. You will also love to problem-solve, get hands-on, experiment, measure, and use automation to make daily life easier.
The Marketing team at Canonical drives commercial outcomes for the company across its portfolio of products and grows the addressable market through digital marketing campaigns, lifecycle management, events, partnerships and community development. If these things are important to you and you're motivated by driving data engineering, delighting customers and filling the sales funnel, we want to talk with you.
This role sits in the Marketing team reporting to the Growth Engineering Manager.
Location: This role will be based remotely in the EMEA region.
What your day will look like
- Utilise advanced data analytics to grow Canonical's product adoption and market penetration
- Focus on quantitative and qualitative data analytics to find insights and meaningful business outcomes
- Design and conduct experiments with data, visualisation and insights into Canonical's target audiences
- Collaborate with stakeholder teams (Product Management, Engineering, Information Systems, Finance, RevOps, etc.) to improve the data and tool ecosystem
- Put in place and maintain systems to ensure teams across the company have self-service access to data dashboards
What we are looking for in you
- Background in data science, mathematics, actuarial science, or engineering
- Knowledge in advanced statistics, data sciences, coding/scripting languages (Python, JS, etc.), and databases (SQL, etc.)
- Strength in data analytics and visualisation (Looker Studio, Tableau, Apache Superset, etc.)
- Ability to translate business questions into key research objectives
- Ability to identify the best methodology to execute research, and to synthesise and analyse findings
- Excellent writing and communication skills
- Willingness to examine the status quo and resilience in the face of challenges
What we offer you
Your base pay will depend on various factors including your geographical location, level of experience, knowledge and skills. In addition to the benefits above, certain roles are also eligible for additional benefits and rewards including annual bonuses and sales incentives based on revenue or utilisation. Our compensation philosophy is to ensure equity right across our global workforce.
In addition to a competitive base pay, we provide all team members with additional benefits, which reflect our values and ideals. Please note that additional benefits may apply depending on the work location and, for more information on these, you can ask in the later stages of the recruitment process.
Fully remote working environment - we've been working remotely since 2004!
Personal learning and development budget of 2,000 USD per annum
Annual compensation review
Recognition rewards
Annual holiday leave
Parental Leave
Employee Assistance Programme
Opportunity to travel to new locations to meet colleagues at 'sprints'
Priority Pass for travel and travel upgrades for long-haul company events
About Canonical
Canonical is a pioneering tech firm that is at the forefront of the global move to open source. As the company that publishes Ubuntu, one of the most important open source projects and the platform for AI, IoT and the cloud, we are changing the world on a daily basis. We recruit on a global basis and set a very high standard for people joining the company. We expect excellence - in order to succeed, we need to be the best at what we do.
Canonical has been a remote-first company since its inception in 2004. Work at Canonical is a step into the future, and will challenge you to think differently, work smarter, learn new skills, and raise your game. Canonical provides a unique window into the world of 21st-century digital business.
Canonical is an equal opportunity employer
We are proud to foster a workplace free from discrimination. Diversity of experience, perspectives, and background create a better work environment and better products. Whatever your identity, we will give your application fair consideration.
#J-18808-Ljbffr
Required Data Engineer - Confidential
Posted today
Job Description
Required Data Engineer
- Bachelor's degree
- Must have completed one of the following certifications:
- Data Engineering certification
- Database Administrator certification (Oracle or SQL server)
- Must have 8+ years of experience in the IT industry, with 3+ years of experience in software development and database administration.
- Must have 3+ years of experience as a Data Engineer.
- Technical expertise in data models, data mining, and segmentation techniques
- Experience in database design
Kindly check the qualifications carefully before applying.
To apply, please send your resume to:
confidential
This job has been sourced from an external job board.
Sr. Data Engineer - Alghanim Industries
Posted today
Job Description
- Oversee and manage various aspects of data integration and data warehouse processes. This includes implementing data quality measures, designing and deploying integration solutions, collaborating with team members, troubleshooting production issues, refining best practices, educating end users, participating in code reviews, exploring and adopting new technologies, leading prototype development, and ensuring adherence to industry standards. The role involves working within Agile teams, reporting on project status, and partnering with IT Solution Architects to optimize data integration designs. The goal is to contribute to efficient and effective data analytics integration solutions while staying informed about technological trends and mentoring team members.
Job Responsibilities
- Operational Work - 75% of time:
- Implement Data Warehouse data quality processes. Provide solutions where issues are identified
- Design, develop, test, and deploy data integration solutions based on business requirements
- Collaborate with team members on data integration designs and alternatives
- Troubleshoot and resolve production issues
- Continually refine and document data integration best practices
- Educate end users (where applicable) in the use of metadata and understanding where data originates
- Participate in code reviews
- Identify, evaluate, and refine leading-edge data integration tools and methods
- Build prototypes and POCs
- Bring a passion to stay on top of tech trends, experiment with, and learn new technologies, participate in internal & external technology communities, and mentor other members of the engineering community
- Own the development of cross-functional, multi-platform prototypes
- Work within and across Agile teams to design, develop, test, implement, and support data analytics integration solutions
- Document and support deployment activities
- Report on status of IT Data Operations products and projects on a regular and timely basis
- Ensure the timely resolution of issues
- Follow standards and guidelines according to industry best practices
- Escalate anticipated risks to management promptly and properly
- Partner with IT Solution Architects regarding data integration design options
- Strategic Work - 25%:
- Educate and inform yourself and team members about data integration technology solutions and emerging trends
- Continually participate in visioning exercises and contribute to the data strategy
- Required years of relevant experience:
- 4+ years of experience designing, developing, and maintaining data integration solutions
- Desired Qualifications:
- Education - B.S. in Computer Science or related discipline.
- Advanced degree a plus
- Specialized Skills:
- Required:
- Data integration development experience with the following skill sets:
- Extract, Transform, and Load (ETL) of disparate data sources to targets including databases, files, and APIs; Talend, Matillion, Glue Data Studio preferred (a brief upsert sketch follows this section)
- Ability to detect, identify, and resolve source to target performance / load issues
- Expert level proficiency in structured query language (SQL)
- Data Warehousing
- Thorough understanding of relational data models including third normal form and fact / dimension designs
- Data marts
- Oracle database platform features including indexing, partitioning, and PL/SQL
- AWS Database Migration Service (DMS), Amazon Redshift, and Aurora PostgreSQL expertise, including but not limited to performance tuning
- Linux O/S commands and scripting
- Secure File Transfer Protocol (SFTP)
- Desire to work in the data integration space (data warehouse, data lake, AWS, data APIs) while continuing to grow skill sets
- Desired:
- Modern development practices - e.g. DevOps
- Experience with open source database platforms:
- PostgreSQL
- MySQL
- MariaDB
- Data Lake integration design and development (ETL & ELT Methodologies)
- Experience with data quality tools
- Experience with the AWS platform:
- DMS
- Glue
- Redshift
- S3, etc.
- AWS Certifications
- Experience in API development
- Master data management integration and concepts
- Experience with job scheduling solutions such as Airflow
- Education: Bachelor’s Degree in Computer Science
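A brief, hypothetical sketch of the staging-to-target upsert pattern implied by the ETL and SQL requirements above; the DSN, table, and column names are illustrative assumptions (the `ON CONFLICT` form shown is PostgreSQL/Aurora syntax; Redshift would use `MERGE`).

```python
# Hypothetical sketch: apply a staged batch to a dimension table as an upsert.
import psycopg2

UPSERT = """
INSERT INTO dim_customer (customer_id, name, updated_at)
SELECT customer_id, name, updated_at FROM stg_customer
ON CONFLICT (customer_id) DO UPDATE
SET name = EXCLUDED.name,
    updated_at = EXCLUDED.updated_at;
"""

# psycopg2 commits the transaction when the connection context exits cleanly.
with psycopg2.connect("dbname=dw user=etl") as conn:  # hypothetical DSN
    with conn.cursor() as cur:
        cur.execute(UPSERT)
```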
This job has been sourced from an external job board.
Senior Data Engineer - Lynx Analytics
Posted today
Job Viewed
Job Description
- Understand the business problems we solve for our clients. This role involves frequent communication with our clients and a close working relationship with the Data Scientist.
- Discover the client's existing data sources that are relevant to the problems we try to solve. This includes discussions with client IT, data owners, future business users, etc.
- Together with IT employees of our clients, decide on the technical architecture for the ETL solution.
- Implement the data ingestion subsystem: the system responsible for moving all the necessary data sources to a single location where the actual analysis will happen (a toy illustration follows this list).
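A toy illustration of the ingestion subsystem described in the last item above: its core job is simply moving source extracts into one landing area. The paths are hypothetical assumptions.

```python
# Hypothetical sketch: copy per-system extracts into a single landing zone
# where downstream analysis runs.
import shutil
from pathlib import Path

SOURCES = [Path("/exports/crm/customers.csv"), Path("/exports/erp/orders.csv")]
LANDING = Path("/data/landing")

LANDING.mkdir(parents=True, exist_ok=True)
for src in SOURCES:
    shutil.copy2(src, LANDING / src.name)  # one location, original filenames
```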
This job has been sourced from an external job board.
Urgent Requirement for Data Engineer for Our
Posted today
Job Description
- Big Data platforms and EDW (data engineering, data quality, data operations, BI/Data Warehouse/Data Lake) - Must
- Dimensional Modeling - Must
- ETL Tools (Microsoft SSIS and Informatica (BDM)) - Must
- Advanced Microsoft Excel - Must
- Banking Domain - Must
**Roles and responsibilities**:
- Understand the business requirements, business processes, and system functionality to develop the data design and flow.
- Utilize best practices for standard naming conventions, standard coding practices, architecture and ETL guidelines to ensure consistency of data models and department outputs.
- Develop and maintain the approved 'Enterprise Business Terminology', 'Definition Document', 'Business Glossary', and integration with metadata.
- Implement processes and logic to extract, transform, and load data across one or more data stores from a wide variety of sources.
- Optimize data integration platform to provide optimal performance under increasing data sources and volumes.
- Include source validation, target validation, quality checks, technical metadata, and logging for each ETL process (see the sketch after this list).
- Responsible for reviewing, supporting, setting, and controlling the daily extraction window, maintaining the process flows, scheduling the jobs, and identifying the dependencies.
- Automate the ETL process and flow to minimize manual intervention and ensure the availability and quality of the data.
- Ensure compliance with company internal/external security, risk, and audit requirements.
- Document all the required information and processes.
- Responsible for deployment into production, ensuring compatibility and adherence to guidelines, and avoiding problems in the production system.
- Share knowledge, coach, and mentor co-workers.
- Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed.
- Work with data architecture to define technical integration requirements to supply the needed source data.
- Support data query needs from multiple partners within the Data Group unit and business units.
- Support user community on data analytics.
- Support data science team by helping with required data preparation and setting up data pipelines for AA models.
- Keep close engagement with the BI team to optimize models and connection patterns for maximum performance.
- Create and maintain detailed documentation for the development team.
- Support API documentation of classes, methods scenarios, code, design rationales, and contracts.
- Responsible for the library of all deployed Application Programming Interfaces (APIs).
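A hedged sketch of the per-step validation-and-logging pattern described in the responsibilities above; the step name, row counts, and load stub are hypothetical.

```python
# Hypothetical sketch: wrap each ETL step with source/target row-count
# validation and structured logging.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")


def run_step(name: str, rows_in: int, load) -> int:
    rows_out = load()  # stubbed loader returning the target row count
    log.info("%s: source=%d target=%d", name, rows_in, rows_out)
    if rows_out != rows_in:  # simple source-to-target validation
        raise ValueError(f"{name}: row count mismatch")
    return rows_out


run_step("load_transactions", rows_in=100, load=lambda: 100)
```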
**Key Responsibilities**:
- Ingest, transform, and store clean, enriched data in a form ready for business intelligence consumption.
- Design, model, and maintain the big data platform, data warehouses, and data marts.
- Maintain, configure, and monitor Data Engineering and ETL tools (e.g., Informatica (BDM), SSIS).
**Required Qualifications**:
- Experience with Big Data platforms and EDW (data engineering, data quality, data operations, BI/Data Warehouse/Data Lake).
- Hands-on experience with various input files (standard CSV, delimited, fixed width, JSON).
- Strong knowledge of emerging technologies and tools.
- Strong problem-solving skills.
- Advanced knowledge in SQL.
- Advanced Knowledge in Database Concepts and Design.
- Advanced Knowledge in Data Warehouse Concepts and Design.
- Advanced Knowledge in Dimensional Modeling.
- Advanced knowledge in Programming Languages.
- Advanced experience in ETL Tools (Microsoft SSIS and Informatica (BDM)).
- Advanced knowledge in Microsoft Excel.
- Strong technical communication & writing skills.
- Advanced understanding of the Banking business.
- 2+ years of experience in data engineering.
- At least two years of ETL design, development, and performance tuning on RDBMS like SQL Server.
- Bachelor's Degree in Computer Science, Computer Engineering, or IS/MIS.
Kindly send us your updated resume in Word format ASAP, along with the details below:
- Total Experience
- Current Salary
- Expected Salary (in KWD)
- Notice period
- Contact Number
- Date of Birth, Gender, Nationality
- Marital Status
- Current Location
Are you ready to relocate to Kuwait?
The position is with Talent Arabia, deputed to the bank; is that fine with you?
How many years of experience do you have as a Data Engineer?
How many years of experience do you have with Big Data platforms?
How many years of experience do you have with Enterprise Data Platforms?
How many years of experience do you have in the following skills:
- Input files (standard CSV, delimited, fixed width, JSON) - Must
- Dimensional Modeling - Must
- ETL design, development, and performance tuning - Must
- ETL Tools (Microsoft SSIS) - Must
- Informatica (BDM) - Must
- Advanced Microsoft Excel - Must
- Banking Domain - Must
This job has been sourced from an external job board.
Data Engineer - b.telligent GmbH & Co. KG - DevJobs
Posted today
Job Description
- Ensure high data quality by implementing data pipeline testing and monitoring
- Collaborate with the team to build and optimize data transformation processes using Dataform, Airflow, and other GCP data engineering tools
- Write clean, efficient Python code for data extraction, transformation, and loading (ETL)
- Harness the power of SQL to query and manipulate data (a brief example follows this list)
- Work with BI tools to create insightful data visualizations and dashboards
- Work closely with our international team
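As a brief, hypothetical example of the SQL-from-Python work the list above mentions, a BigQuery query on GCP might look like this; the project, dataset, and table names are illustrative assumptions.

```python
# Hypothetical sketch: run an aggregate query against BigQuery and print rows.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

QUERY = """
    SELECT country, COUNT(*) AS users
    FROM `example-project.analytics.users`
    GROUP BY country
    ORDER BY users DESC
"""

for row in client.query(QUERY).result():
    print(row.country, row.users)
```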
This job has been sourced from an external job board.