Dinesh Amilineni

Summary

Dynamic and results-driven Data Architect and AI Specialist with over 14 years of experience designing and implementing cutting-edge data solutions for global financial institutions. Proficient in cloud-native architectures, data governance, and Generative AI (GenAI), with a proven ability to optimize costs, enhance system performance, and lead high-performing teams. Expertise spans translating business needs into scalable, secure data platforms, leveraging Azure, AWS, and big data ecosystems. Passionate about driving innovation through ethical AI practices and robust data management strategies.

Overview

14 years of professional experience
1 Certification

Work History

Senior Solution Architect

Manulife
08.2023 - Current
  • Developed and executed enterprise-wide GenAI strategic plan, aligning with corporate strategy to enhance customer experience and operational efficiency.
  • Led the design and implementation of scalable GenAI solutions on Azure, integrating real-time analytics.
  • Identified high-impact GenAI opportunities, deploying a RAG-powered chatbot and a text-to-SQL solution with AI gateway capabilities, improving the user experience.
  • Ensured data integrity and security for GenAI applications, collaborating with compliance teams and implementing robust governance frameworks.
  • Drove GenAI adoption through training and upskilling programs, enhancing staff digital capabilities.
  • Navigated institutional structures to align GenAI initiatives with corporate and regulatory requirements.
  • Led cross-functional teams in GenAI solution development, managing timelines and deliverables.
  • Stayed updated on GenAI trends, introducing advancements to the team.
  • Led the design and implementation of a scalable, cost-optimized Data Lake and Data Warehouse architecture on Azure, incorporating an Operational Data Store (ODS) layer for real-time data integration and enabling near real-time analytics. The resulting Target Architecture supports advanced analytics, machine learning workloads, and Generative AI applications, and reduced cloud infrastructure costs by 30% within the first year.
  • Architected data platforms foundational for GenAI applications, ensuring scalability and performance.
  • Optimized data workflows, achieving 30% faster query speeds and 20% reduced processing times.
  • Led 400TB data migration with zero loss and downtime, ensuring data integrity and security.
  • Translated complex business requirements into robust, scalable data solutions, collaborating with stakeholders across business units to ensure alignment with strategic objectives and deliver actionable insights. Designed solutions leveraging Change Data Capture (CDC) to ensure timely, accurate data replication from source systems for both analytical and operational use cases, and explored and prototyped Generative AI solutions to enhance customer experience and operational efficiency.
  • Architected and implemented data ingestion pipelines for diverse data sources (Oracle, DB2, Salesforce, etc.) into the Enterprise Data Lake, ensuring adherence to data governance policies and data quality initiatives. Optimized these pipelines for the data preparation and feature engineering required by machine learning and GenAI models.
  • Optimized ETL/ELT processes using Azure Data Factory and Databricks, resulting in a 35% improvement in data processing times compared to previous Cloudera-based workflows, enabling faster generation of critical sales reports and improving decision-making.
  • Engineered and implemented comprehensive data retention and governance policies, ensuring compliance with regulatory requirements and reducing data storage costs by 15% through effective data lifecycle management. Established robust data frameworks for AI/ML models, mitigating compliance risks and improving data quality for critical business intelligence.
  • Mentored and guided a team of 7 data engineers and analysts, fostering a culture of continuous learning; targeted training programs and knowledge-sharing initiatives improved team productivity by 15% and drove the successful adoption of MLOps, LLMOps, and GenAI technologies.
  • Designed and implemented an MLOps platform on Azure using Azure Databricks Unity Catalog and MLflow, automating the end-to-end machine learning lifecycle from model training and validation to deployment and monitoring, significantly reducing model deployment time and improving model iteration speed.
  • Evaluated and recommended LLMOps tools and best practices for managing the lifecycle of Large Language Models, including model fine-tuning, deployment, monitoring, and security, to ensure the responsible and efficient operation of GenAI applications.
  • Contributed to the development of a GenAI strategy for the organization, identifying key use cases, evaluating platform options, and defining data requirements and governance frameworks for successful GenAI adoption.
  • Implemented disaster recovery plans that ensured minimal service disruptions during unforeseen events or system failures.
  • Led cross-functional teams to deliver complex IT solutions on schedule and within budget, ensuring client satisfaction and repeat business.

Senior Data Engineer

Manulife
04.2018 - 07.2023
  • Spearheaded the implementation of CI/CD pipelines for diverse applications, automating code deployment and infrastructure changes, resulting in a 40% reduction in deployment time and a 25% decrease in deployment-related errors. Leveraged NiFi registry and toolkit to administer automated deployments, ensuring consistency and reliability across environments.
  • Established and streamlined routine maintenance procedures for critical data infrastructure, proactively identifying and resolving system errors, improving system uptime by 15% and reducing critical incidents by 30%. Created comprehensive maintenance documentation and troubleshooting guides, enhancing team efficiency and knowledge sharing.
  • Led performance evaluation and optimization efforts for existing data applications and platforms, identifying bottlenecks and recommending performance enhancements that resulted in a 30% improvement in query speeds and a 20% reduction in data processing times, enabling faster generation of critical sales reports and improving decision-making.
  • Analyzed tool usage patterns across data engineering teams and implemented best practices for efficiency, leading to a 15% increase in team productivity and a 10% reduction in operational costs. Developed and rolled out standardized practices for tool utilization, code deployment, and environment management, fostering a culture of efficiency and continuous improvement.
  • Streamlined code deployment coordination processes, proactively troubleshooting deployment issues and collaborating effectively with development teams to ensure smooth and timely releases. Reduced deployment failure rates by 20% through improved communication, process optimization, and proactive issue resolution.
  • Enhanced data security and system resiliency by leveraging expertise in security, Disaster Recovery, Data Replication, and SSL implementation. Implemented robust security measures, including data encryption and access controls, and designed and tested disaster recovery procedures, reducing potential data loss by an estimated 99% and ensuring business continuity.
  • Led the transformation and migration of existing data to new environments and databases, ensuring data integrity and minimal disruption to business operations. Successfully migrated over 400 TB of data with zero data loss within a nine-month timeframe, enabling the decommissioning of legacy systems and delivering cost savings.
  • Collaborated with cross-functional teams to define requirements and develop end-to-end solutions for complex data engineering projects.
  • Evaluated emerging technologies and tools to identify opportunities for enhancing existing systems or creating new ones.

Senior Software Engineer

HTC Global Services
07.2014 - 04.2018
  • Developed and maintained robust data streaming systems and architecture for diverse agro-applications, processing high-velocity data from IoT devices to enable real-time insights and operational efficiency. This initiative supported data-driven decision-making.
  • Engineered and optimized data ingestion pipelines using Apache Nifi to collect raw data from IoT devices and reliably transfer it to Amazon S3 for downstream processing and analysis. This streamlined data acquisition, reducing data latency by 20% and ensuring data availability for critical analytics workflows.
  • Led a proof-of-concept initiative to integrate Apache Kafka and Apache Storm for real-time data processing and aggregation, demonstrating the feasibility of a highly scalable and fault-tolerant architecture for handling streaming data. This PoC laid the foundation for future real-time data analytics capabilities and informed strategic technology decisions.
  • Designed and implemented ETL/ELT processes to load and transform large datasets of structured and semi-structured data, ensuring data quality and consistency for analytical and reporting purposes, and optimized data pipelines for efficient processing and storage.
  • Provided strategic recommendations on appropriate big data tools and technologies to address evolving data challenges and optimize day-to-day operations. These recommendations influenced technology adoption decisions and contributed to improved data infrastructure efficiency and scalability.

Senior System Engineer

iGate
03.2011 - 07.2014
  • Delivered comprehensive end-to-end support for mission-critical AS/400 systems at GE, managing and maintaining a global infrastructure of 110 servers supporting diverse business functions and a large user base. This role honed expertise in system administration, performance optimization, and ensuring high availability for critical business applications.
  • Streamlined server performance and enhanced system stability through proactive maintenance procedures, system upgrades, and performance tuning.
  • Spearheaded the design and implementation of AS/400 backup and recovery procedures, ensuring robust data protection and system integrity for critical business data. These procedures minimized the risk of data loss and ensured business continuity in the event of system failures or disasters.
  • Resolved complex system issues and provided expert troubleshooting support, minimizing disruptions to business operations and ensuring timely resolution of critical incidents. Consistently exceeded service level agreements (SLAs) for system availability and issue resolution.
  • Developed and maintained comprehensive system documentation and provided valuable insights and recommendations for continuous improvement, contributing to enhanced system management practices and knowledge sharing within the team.

Education

Bachelor of Technology - Computer Science Engineering

JNTU-A
India
04-2010

Master of Science - Data Science

University of Sunderland
Hong Kong
04-2026 (expected)

Skills

  • Cloud Platforms: Azure (Data Factory, Databricks, Synapse, Cosmos DB, Purview, Azure AI Stack), AWS (S3, Redshift, Lambda)
  • Big Data Technologies: Hadoop (Hive, HBase, Atlas, Ranger), Spark, Kafka, NiFi, Sqoop, Spark Streaming
  • AI/ML & GenAI: AutoGen, LangChain, MLOps, LLMOps, RAG, Ethical AI frameworks
  • Programming Languages: Python, PySpark, SQL, Java, Bash, Shell Scripting
  • Data Tools: Power BI, Informatica, Azure Data Factory, Databricks
  • Databases: Azure SQL, Oracle, DB2, Neo4j, Cosmos DB, PostgreSQL
  • Methodologies & Practices: Agile, Data Governance, Data Security
  • Data Management Expertise:
  • Data Management Process – Streamlined workflows for data lifecycle management
  • Data Architecture – Designed scalable and resilient data frameworks
  • Document and Content Management – Implemented systems for efficient document storage and retrieval
  • Data Ethics – Ensured responsible AI and data usage practices
  • Data Governance – Established policies for data consistency and compliance
  • Data Integration and Interoperability – Enabled seamless data exchange across systems
  • Master and Reference Data Management – Maintained authoritative data sources for accuracy
  • Data Modelling and Design – Created robust models to support business analytics
  • Data Quality – Improved data reliability through cleansing and validation
  • Data Security – Protected sensitive data with advanced security measures
  • Data Storage and Operations – Optimized storage solutions for performance and cost
  • Metadata Management – Managed metadata to enhance data discoverability
  • Enterprise architecture design, Security and compliance, Infrastructure automation
  • Soft Skills:

  • Stakeholder Engagement
  • Team Collaboration
  • Technical Mentorship
  • Problem Solving
  • Effective Communication
  • Negotiation Skills

Certification

  • Azure Certified AI-102: Designing and Implementing a Microsoft Azure AI Solution
  • Databricks Certified Data Engineer Professional
  • Azure Databricks Platform Architect
  • Certified: Data Management Fundamentals Exam: Master Level
  • Microsoft Certified: Azure Solutions Architect Expert
  • Accredited - Confluent Fundamentals for Apache Kafka

Timeline

Senior Solution Architect

Manulife
08.2023 - Current

Senior Data Engineer

Manulife
04.2018 - 07.2023

Senior Software Engineer

HTC Global Services
07.2014 - 04.2018

Senior System Engineer

iGate
03.2011 - 07.2014

Bachelor of Technology - Computer Science Engineering

JNTU-A

Master of Science - Data Science

University of Sunderland