Data Solution Architecture remote jobs
EPAM Anywhere is looking for remote Data Solution Architecture Specialists.
Competitive compensation
We guarantee all payments and operate in compliance with country-specific labor regulations. Your annual compensation can range from $15k to $200k.
Numerous benefits
Make use of 1,500+ online and location-specific benefits across 30+ countries that also include healthcare and sports programs for you and your family.
Limitless career growth opportunities
Get access to a large-scale ecosystem of educational and career development services created for your growth. A personal skill advisor will support you at each step of this journey.
Salesforce
Salesforce Admin Tools, Salesforce Automation Processes
40 hrs/week
12+ months
- Perform system administration and declarative automation in the Salesforce environment
- Review impact of Salesforce seasonal releases and implement necessary changes
- Design innovative Salesforce solutions that integrate seamlessly with other systems and meet technical and business needs
- Lead and provide expertise on Salesforce projects, ensuring the architecture is scalable and future-proof
- Collaborate with project managers and other stakeholders to deliver projects on time
- Offer strategic guidance and technical leadership throughout the lifecycle of the project
- Keep abreast of new Salesforce features and trends to enhance service offerings and practice development
- Mentor and provide architectural expertise to other team members
- Work with product teams to help define and refine requirements
- 7+ years of Salesforce experience with strong practical knowledge
- At least 2 projects in a technical team lead role, with continuous hands-on work, during the last 3 years
- Proficiency in Salesforce Admin Tools and Salesforce Automation Processes
- Strong understanding of enterprise data and systems
- Experience leading and providing expertise on Salesforce projects, ensuring scalability and future-proof architecture
- Collaborative skills to work with project managers and stakeholders to deliver projects on time
- Up-to-date knowledge of new Salesforce features and trends to enhance service offerings and practice development
- Proven ability to mentor and provide architectural expertise to other team members
- Excellent communication skills to work with product teams in defining and refining requirements
- B2+ English level proficiency
Solution Architecture
JavaScript
40 hrs/week
12+ months
- Architect end-to-end solutions using JavaScript technologies, aligning with business requirements, technical constraints, and industry best practices. Develop detailed architecture blueprints, technical specifications, and implementation plans.
- Evaluate JavaScript frameworks, libraries, and tools to recommend optimal solutions based on project requirements, performance considerations, and scalability needs. Stay abreast of emerging trends and innovations in the JavaScript ecosystem to drive continuous improvement and innovation.
- Provide technical leadership and guidance to development teams, including developers, engineers, and QA analysts, throughout the software development lifecycle. Collaborate closely with stakeholders to translate business needs into technical solutions.
- Design and implement robust frontend architectures using JavaScript frameworks such as React.js, Angular, or Vue.js. Define reusable components, state management strategies, and data flow patterns to ensure scalability, maintainability, and performance.
- Architect scalable and maintainable backend solutions using JavaScript technologies such as Node.js, Express.js, or NestJS. Design RESTful APIs, data models, and database schemas to support application functionality and integration requirements.
- Lead full stack development initiatives, integrating frontend and backend components to deliver cohesive, end-to-end solutions. Define communication protocols, data formats, and authentication mechanisms to enable seamless interaction between frontend and backend systems.
- Optimize application performance, responsiveness, and reliability through efficient code design, data caching, and asynchronous processing. Conduct performance profiling, code reviews, and optimizations to enhance system scalability and user experience.
- Implement robust security measures and best practices to protect JavaScript applications from security vulnerabilities, cyber threats, and data breaches. Ensure compliance with industry regulations and standards, such as GDPR and OWASP.
- Prepare technical documentation, architecture diagrams, and best practice guides for JavaScript solutions. Provide training and knowledge transfer sessions to development teams and stakeholders on JavaScript technologies, design patterns, and coding standards.
- Bachelor's or Master's degree in Computer Science, Engineering, or related field
- 7+ years of experience in IT
- Proven experience as a Solutions Architect or similar role, with a minimum of 2 years in architecting and implementing JavaScript solutions
- Expertise in JavaScript frameworks and libraries, including React.js, Angular, Vue.js, Node.js, Express.js, or NestJS
- Proficiency in frontend development languages and technologies, including HTML, CSS, and JavaScript (ES6+)
- Experience with TypeScript, GraphQL, or serverless architectures
- Experience with backend development technologies, RESTful APIs, and database systems (e.g., MongoDB, PostgreSQL, MySQL)
- Strong understanding of software architecture principles, design patterns, and best practices
- Excellent problem-solving skills and ability to troubleshoot complex technical issues in JavaScript applications
- Effective communication and interpersonal skills, with the ability to articulate technical concepts to non-technical stakeholders
- Strong leadership abilities to lead cross-functional teams and drive collaboration towards common goals
- C1 English level proficiency
- Certification in JavaScript frameworks or architecture
- Familiarity with containerization technologies (e.g., Docker, Kubernetes)
- Contribution to open-source projects or active participation in technical communities focused on JavaScript development
Salesforce Marketing Cloud (ExactTarget)/Pardot
Salesforce, Salesforce Apex, Salesforce Data Cloud
40 hrs/week
12+ months
- Develop Salesforce configurations, Visualforce, Apex classes, Apex web services, APIs, AppExchange deployments, Flows, Aura components, and Lightning Web Components
- Manage global and large-scale projects
- Work with Marketing Cloud to deliver innovative solutions
- Respond to changes and proactively resolve issues to deliver the best possible outcome for clients
- Collaborate with cross-functional teams using Agile methodology and product scrum teams
- Lead and manage teams of developers, ensuring successful project outcomes
- 7+ years of Salesforce experience with strong practical development knowledge of Salesforce configurations, Visualforce, Apex classes, and Apex web services
- Deep understanding of APIs, AppExchange deployment, Flows, Aura components, and Lightning Web Components
- Proven track record in managing global and large-scale projects
- Experience working with Marketing Cloud
- Good understanding of Agile methodology and product scrum teams
- Salesforce Certified Application Architect and/or Salesforce Certified System Architect credentials are an advantage
- Excellent organizational, interpersonal, communication, presentation, and writing skills
- Ability to work with others in a high-paced, fluid, multi-cultural, and multi-disciplinary team
- Upper-Intermediate level of spoken and written English
- Salesforce Data Cloud experience
Data Software Engineering
Databricks, Microsoft Azure, PySpark
40 hrs/week
12+ months
- Contribute to the design and development of novel features within the Agile development framework (Scrum)
- Prioritize and uphold high-quality standards across all development stages
- Ensure the reliability, availability, performance, and scalability of systems
- Troubleshoot and maintain code within expansive and intricate environments
- Collaborate with Developers, Product and Program Management, and senior technical personnel to provide customer-centric solutions
- Offer technical insights for new feature requirements, collaborating with business owners and architects
- Stay abreast of industry trends and emerging technologies for continuous improvement
- Implement solutions aligned with business objectives
- Guide and mentor less experienced team members to foster skill enhancement and career growth
- Participate in code reviews, upholding code quality and adherence to standards
- Actively engage in architectural and technical discussions within cross-functional teams to achieve project goals
- Minimum of 3 years of hands-on experience in Data Software Engineering in a production setting
- Proficiency in Databricks, Microsoft Azure, PySpark, Python, and SQL for development and deployment to production
- Familiarity with Azure DevOps, GitHub (or alternative platforms), and version control for effective project management
- Capability to architect end-to-end production solutions
- Robust experience on one or more cloud platforms like Azure, GCP, AWS
- Proven track record in constructing resilient data pipelines
- Ability to integrate disparate elements for comprehensive solutions across systems
- Exceptional communication skills in both spoken and written English, at an upper-intermediate level or higher
- Exposure to REST APIs and Power BI would be advantageous
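The pipeline-construction work described above usually reduces to extract-transform-load steps. Below is a minimal, framework-free sketch in plain Python standing in for a Databricks/PySpark job; all field names (`id`, `country`, `user_id`) and functions are hypothetical, chosen only for illustration:

```python
import json

def extract(raw_lines):
    """Parse one JSON record per line, skipping malformed rows."""
    records = []
    for line in raw_lines:
        try:
            records.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # a production pipeline would route these to a dead-letter store
    return records

def transform(records):
    """Normalize field names and types for downstream consumers."""
    return [
        {"user_id": int(r["id"]), "country": r.get("country", "unknown").lower()}
        for r in records
        if "id" in r
    ]

def load(rows, sink):
    """Append rows to an in-memory sink standing in for a warehouse table."""
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(transform(extract([
    '{"id": "1", "country": "DE"}',
    'not json',
    '{"no_id": true}',
])), sink)
```

In a real Databricks deployment the same three stages would typically be expressed as PySpark DataFrame reads, transformations, and writes rather than Python lists.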
Data Software Engineering
Databricks, Python, Amazon Web Services
40 hrs/week
12+ months
- Build an enterprise-grade data platform
- Establish data governance and data quality
- Implement reusable Databricks components for data ingestion and analytics
- Work collaboratively with architects, technical leads and key individuals within other functional groups
- Actively participate in code review and test solutions to ensure they meet best practice specifications
- Write project documentation
- 3+ years of experience as a Python developer
- Expertise in data software engineering, Python, and Azure Databricks
- Experience with data modeling and hands-on Big Data development experience
- Cloud experience in designing and administering scalable, fault-tolerant systems
- Strong data-oriented mindset and compliance awareness
- Experience with Amazon Web Services and SQL
- Upper-intermediate English level (B2+)
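"Establish data governance and data quality," as listed above, often starts with rule-based batch checks. A minimal sketch in plain Python; the metric names and the `id`/`ts` fields are hypothetical examples, not a prescribed schema:

```python
def check_quality(rows, required=("id", "ts"), unique_key="id"):
    """Return basic data-quality metrics for a batch of row dicts:
    total row count, rows missing required fields, and duplicate keys."""
    missing = sum(
        1 for r in rows
        if any(k not in r or r[k] is None for k in required)
    )
    keys = [r.get(unique_key) for r in rows if r.get(unique_key) is not None]
    duplicates = len(keys) - len(set(keys))
    return {
        "row_count": len(rows),
        "missing_required": missing,
        "duplicate_keys": duplicates,
    }
```

In practice such checks would run as expectations inside the pipeline (e.g. failing or quarantining a batch when a metric crosses a threshold) rather than as a standalone function.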
Data Software Engineering
Databricks, Microsoft Azure, PySpark
40 hrs/week
12+ months
- Design and develop new features using the Agile development process (Scrum)
- Prioritize and ensure high-quality standards at every stage of development
- Guarantee reliability, availability, performance, and scalability of systems
- Maintain and troubleshoot code in large-scale, complex environments
- Collaborate with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
- Provide technical input for new feature requirements, partnering with business owners and architects
- Ensure continuous improvement by staying abreast of industry trends and emerging technologies
- Drive the implementation of solutions aligned with business objectives
- Mentor and guide less experienced team members, helping them enhance their skills and grow their careers
- Participate in code reviews, ensuring code quality and adherence to standards
- Collaborate with cross-functional teams to achieve project goals
- Actively contribute to architectural and technical discussions
- At least 3 years of production experience in Data Software Engineering
- Expertise in Databricks, Microsoft Azure, PySpark, Python, and SQL for development and deployment to production
- Experience with Azure DevOps, GitHub (or similar platforms), and version control for effective project management
- Ability to develop end-to-end production solutions
- Strong experience working on one or more cloud platforms such as Azure, GCP, AWS
- Experience in building out robust data pipelines
- Ability to integrate disparate components into cohesive solutions across systems
- Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
- Experience with REST APIs and Power BI would be a plus
Data Software Engineering
Databricks, Python, PySpark
40 hrs/week
12+ months
- Design and develop new features using the Agile development process (Scrum)
- Prioritize and ensure high-quality standards at every stage of development
- Guarantee reliability, availability, performance, and scalability of systems
- Maintain and troubleshoot code in large-scale, complex environments
- Collaborate with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
- Provide technical input for new feature requirements, partnering with business owners and architects
- Ensure continuous improvement by staying abreast of industry trends and emerging technologies
- Drive the implementation of solutions aligned with business objectives
- Mentor and guide less experienced team members, helping them enhance their skills and grow their careers
- Participate in code reviews, ensuring code quality and adherence to standards
- Collaborate with cross-functional teams to achieve project goals
- Actively contribute to architectural and technical discussions
- 3+ years of production experience in Data Software Engineering
- Hands-on, with deep expertise in server-side development in Python and PySpark
- Deep expertise in Azure Data Factory for building scalable and high-performance applications
- Experience with Advanced SQL for designing and managing database schema, including procedures, triggers, and views
- Experience in Data analysis and troubleshooting
- Knowledge of integration testing to support version control, integration, and deployment
- Ability to support applications and systems in a production environment, ensuring timely resolution of issues
- Ability to review requirements and translate them into a documented technical design for implementation
- Exposure to Databricks, HDInsight, Azure Data Lake, data APIs, Spark, Scala, and Kafka for application packaging and deployment
- Strong Big Data skills and a solid data background for designing and building scalable applications
- Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
- Experience with EDL changes in database views/stored procedures is a plus
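The advanced-SQL requirement above covers designing views and triggers alongside tables. A minimal sketch using Python's built-in `sqlite3` for brevity (SQLite lacks stored procedures, so only a view and a trigger are shown); the `orders`/`audit` schema is a hypothetical example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, status TEXT);
CREATE TABLE audit (order_id INTEGER, note TEXT);

-- a view exposing only completed orders to downstream consumers
CREATE VIEW completed_orders AS
    SELECT id, amount FROM orders WHERE status = 'completed';

-- a trigger that records every insert, standing in for audit logic
CREATE TRIGGER log_insert AFTER INSERT ON orders
BEGIN
    INSERT INTO audit VALUES (NEW.id, 'created');
END;
""")
conn.execute("INSERT INTO orders VALUES (1, 9.99, 'completed')")
conn.execute("INSERT INTO orders VALUES (2, 5.00, 'pending')")

completed = conn.execute("SELECT * FROM completed_orders").fetchall()
audit_count = conn.execute("SELECT COUNT(*) FROM audit").fetchone()[0]
```

On an enterprise engine (e.g. Azure SQL or a Databricks SQL warehouse) the same pattern would extend to stored procedures and materialized views.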
Data Software Engineering
Databricks, Microsoft Azure, PySpark
40 hrs/week
12+ months
- Engage in the Agile development process (Scrum) to conceive and implement innovative features
- Prioritize and uphold high-quality standards throughout each developmental phase
- Ensure the dependability, accessibility, performance, and scalability of systems
- Troubleshoot and maintain code within expansive, intricate environments
- Work in tandem with Developers, Product and Program Management, and seasoned technical professionals to furnish customer-centric solutions
- Provide technical insights for new feature requirements in collaboration with business owners and architects
- Stay abreast of industry trends and emerging technologies for continuous improvement
- Champion the execution of solutions aligned with business objectives
- Guide and mentor less seasoned team members, fostering skill enhancement and career growth
- Participate in code reviews, ensuring adherence to standards and code quality
- Collaborate seamlessly with cross-functional teams to achieve project objectives
- Actively contribute to architectural and technical discourse
- A minimum of 3 years of hands-on experience in Data Software Engineering
- Proficiency in Databricks, Microsoft Azure, PySpark, Python, and SQL for development and deployment in production
- Familiarity with Azure DevOps, GitHub (or alternative platforms), and version control for efficient project management
- Capability to develop comprehensive end-to-end production solutions
- Robust experience on one or more cloud platforms such as Azure, GCP, AWS
- Proven track record in constructing resilient data pipelines
- Capacity to integrate disparate elements for solutions spanning multiple systems
- Exceptional communication skills in both spoken and written English, at an upper-intermediate level or higher
- Experience with REST APIs and Power BI would be an advantage
Data Software Engineering
Python.Core, Pandas, Databricks
40 hrs/week
12+ months
- Design and implement scalable, high-performance data software solutions that meet project requirements
- Develop and implement data pipelines for collecting data from various sources
- Create regular reports and dashboards using data analysis and visualization tools as needed
- Write data transformations using advanced scripting
- Collaborate with cross-functional teams to conceptualize, design, develop, and implement effective data solutions
- Ensure data quality and integrity throughout the data lifecycle
- Develop and maintain technical documentation for data solutions
- Stay up-to-date with emerging trends and technologies in data software engineering
- At least 3 years of experience in Data Software Engineering
- Expertise in Python programming and core Python libraries such as Pandas
- Experience with Power BI Report Development and Jupyter Notebook
- Strong knowledge of Databricks and Amazon Web Services
- Experience with data modeling and database design
- Proven ability to design and implement scalable, high-performance data software solutions
- Familiarity with Agile methodologies and software development best practices
- Excellent written and oral communication skills in English (Upper-Intermediate level)
- Experience with data visualization tools such as Tableau or QlikView
- Familiarity with data warehousing and ETL processes
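The reporting and transformation duties above typically boil down to group-and-aggregate steps feeding a dashboard. A minimal sketch in plain Python (in this role it would more likely be written with Pandas `groupby`); the `region`/`revenue` fields are hypothetical:

```python
from collections import defaultdict

def sales_report(rows):
    """Aggregate revenue per region -- the kind of summary a
    Power BI dashboard or Jupyter notebook would consume."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["revenue"]
    # return a plain dict with regions in sorted order for stable reporting
    return dict(sorted(totals.items()))
```

Usage: `sales_report([{"region": "emea", "revenue": 10.0}, ...])` yields one total per region, ready to serialize for a reporting layer.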
Data Software Engineering
Databricks, Microsoft Azure, PySpark
40 hrs/week
12+ months
- Design and implement scalable data pipelines to support our cutting-edge applications
- Ensure data quality and data accuracy across all stages of data processing
- Collaborate with cross-functional teams to understand business requirements and develop solutions that meet their needs
- Develop and maintain codebase in accordance with industry best practices and standards
- Troubleshoot and resolve issues in a timely and effective manner
- Optimize data processing algorithms and improve application performance
- Ensure compliance with data security and data privacy regulations
- Conduct code reviews and ensure high code quality and compliance with standards and guidelines
- Participate in architectural and technical discussions to help shape the product roadmap
- Stay up-to-date with emerging trends and technologies in data engineering and analytics
- 3+ years of experience as a Data Software Engineer or in similar roles
- Expertise in one of Python, Spark/PySpark, or SQL for building scalable and high-performance applications
- Experience with Microsoft Azure for cloud-based infrastructure and application management
- Experience using Databricks for building robust data pipelines
- Experience using Azure DevOps, GitHub, or other version control systems
- Familiarity with developing end-to-end production solutions
- Ability to integrate disparate components into cohesive solutions across systems
- Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
- Experience with GCP and AWS cloud platforms
- Experience with Apache Kafka and Apache Beam for building data pipelines
- Experience with machine learning and data science tools and frameworks
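"Ensure data quality and data accuracy across all stages of data processing," listed above, often means deduplicating events from at-least-once sources such as Kafka. A minimal sketch in plain Python; the `event_id` key is a hypothetical example:

```python
def deduplicate(events, key="event_id"):
    """Drop repeated events by key, preserving first-seen order --
    a common correctness step when upstream sources deliver
    the same message more than once."""
    seen = set()
    out = []
    for e in events:
        k = e[key]
        if k not in seen:
            seen.add(k)
            out.append(e)
    return out
```

In a streaming pipeline the `seen` set would be bounded (e.g. a time-windowed state store) rather than unbounded in memory.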