TIBCO Architect



Riyadh, Ar Riyāḑ, Saudi Arabia | Quikr | Full time | $1,200,000 - $2,000,000 per year

Job Role: TIBCO Architect/SME (Subject Matter Expert)

Experience required: 8+ years

Job Location: Riyadh, Saudi Arabia

We are seeking an experienced Data Engineer – SME / TIBCO Architect to lead the design, development, and optimization of enterprise-level data integration and messaging solutions. The ideal candidate will have deep expertise in TIBCO technologies (BusinessWorks, EMS, BWCE, BE, API management, etc.), strong data engineering skills, and a proven ability to architect scalable, secure, high-performance integration ecosystems.

This role combines technical leadership, solution architecture, and hands-on engineering to support mission-critical business processes.

Key Responsibilities

Architecture & Design
  - Architect end-to-end TIBCO-based integration and data engineering solutions.
  - Design scalable, fault-tolerant, event-driven, and service-oriented architectures.
  - Develop integration patterns using TIBCO BusinessWorks, EMS, FTL, BWCE, and APIX/Mashery.
  - Evaluate existing systems and define modernization roadmaps (cloud, microservices, containerization).

Data Engineering
  - Design and build robust data pipelines for structured, semi-structured, and unstructured data.
  - Implement ETL/ELT workflows, streaming pipelines, and real-time data processing.
  - Optimize data flows for performance, reliability, and cost efficiency.
  - Integrate data solutions with cloud platforms (AWS, Azure, GCP) and modern data stacks.

Technical Leadership
  - Serve as the subject matter expert for TIBCO technologies and enterprise integration strategy.
  - Provide technical guidance, best practices, and mentorship to engineering teams.
  - Collaborate with business stakeholders, architects, and developers to translate requirements into technical solutions.
  - Lead design reviews, performance tuning, troubleshooting, and production support.

Development & Implementation
  - Build and configure TIBCO components including BW/BWCE, EMS, RV, BE, ActiveSpaces, and TIBCO Adapters.
  - Develop APIs, orchestrations, and microservices using TIBCO and complementary technologies.
  - Implement CI/CD pipelines, infrastructure as code, and automated deployment frameworks.
  - Ensure compliance with enterprise security, data governance, and DevOps standards.

Support & Optimization
  - Monitor integration ecosystem performance and identify opportunities for improvement.
  - Lead incident analysis, root-cause investigations, and system remediation.
  - Optimize messaging, data flows, and API performance for high-throughput applications.
Required Skills & Qualifications

  - Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
  - 10+ years of experience in enterprise integration or data engineering roles.
  - 7+ years of hands-on experience with TIBCO technologies, including:
      - BusinessWorks / BWCE
      - EMS / FTL
      - TIBCO BE
      - TIBCO Adapters
      - Mashery / APIX
      - ActiveMatrix / SFG (optional)
  - Strong expertise in data engineering:
      - ETL/ELT, streaming pipelines, and data modeling
      - SQL/NoSQL databases (Oracle, Postgres, Mongo, Cassandra, etc.)
      - Big Data tools (Hadoop, Spark, Kafka, Flink)
  - Strong proficiency in Java, Python, or Go, and scripting languages (Shell, Groovy).
  - Experience with Docker, Kubernetes, CI/CD pipelines, and Git-based workflows.
  - Cloud experience (AWS / Azure / GCP) for data integration, messaging, and APIs.
  - Deep understanding of integration patterns, SOA, microservices, and event-driven architecture.
  - Excellent communication, documentation, and stakeholder management skills.

Preferred Qualifications (not mandatory)

  - TIBCO certifications (BusinessWorks, EMS, FTL, API Management).
  - Experience migrating from legacy TIBCO stacks to cloud-native environments.
  - Exposure to Snowflake, Databricks, or cloud data warehousing platforms.
  - Knowledge of MDM, data governance, and metadata management tools.
  - Experience with Agile/Scrum methodology.