EcoOnline is one of the fastest-growing software companies in Europe. We develop SaaS (Software as a Service) solutions for all types of businesses, with the goal of preventing harm to people and the environment in health, safety, environment and quality (HSEQ). In this way, we help save lives and protect the environment. In addition, we offer software solutions that help our customers create safe, sustainable, efficient and attractive workplaces.
More than 6,300 companies across 80 industries in Northern Europe have chosen EcoOnline as a supplier of user-friendly SaaS tools within HSEQ. We are growing rapidly: today we are 350 talented colleagues in our offices in Tønsberg & Oslo in Norway, Gothenburg in Sweden, Espoo in Finland, Aarhus in Denmark, Dublin & Limerick in Ireland, and Birmingham & Liverpool in the United Kingdom. Our vision is to be the preferred provider of HSEQ tools & services and to deliver sustainable results for our customers around the globe.
To continue to grow, we are now looking for a new colleague. As a Data Warehouse Architect at EcoOnline, you will be an important part of the company's further growth, and you will gain valuable experience and a professional network.
Bringing big data together with other data systems can yield amazing insights, particularly when the results are presented in ways that let users see the information and make decisions that affect the business. As technologies like Hadoop and R enable cheaper distributed processing and improved analytical capabilities, the barrier to getting insights from new types of data, as well as from ever-increasing volumes of data, has come down.
The Data Warehouse Architect role drives customer initiatives, leveraging data services to solve the biggest and most complex data challenges faced by EcoOnline's customers. This is a technical role, accountable for the end-to-end deployment and usage of data services.
The Data Warehouse Architect owns the Data Warehouse Services technical architecture: data architecture design sessions, implementation projects and/or proofs of concept. The ideal candidate will have experience in customer-facing roles and a record of success leading deep technical architecture and design discussions with senior executives.
· Interface with delivery teams to design data services solutions in support of the overarching analytics services solution, including SQL Database, DocumentDB, SQL Data Warehouse, HDInsight, Azure Machine Learning, Stream Analytics, Data Factory, Event Hubs and Notification Hubs
· Architect scalable data processing and analytics solutions, including technical feasibility for Big Data storage, processing and consumption e.g., development of enterprise Data Lake strategy, heterogeneous data management, Polyglot Persistence, decision support over Data Lake
· Design, coordinate and execute pilots, prototypes or proofs of concept, and provide validation for specific scenarios; report on progress against business objectives; ensure plan execution; document and share technical best practices and insights with engineering teams and the solution architect community
· 5+ years of experience with a deep understanding of both traditional and modern data architecture and processing concepts, including relational databases (e.g., SQL Server, MySQL, Oracle), data warehousing, big data (Hadoop, Spark, Storm), NoSQL, and business analytics
· 5+ years of success in consultative/complex technical deployment projects (where necessary, managing various stakeholder relationships to get consensus on solutions)
· Understanding of big data use cases and Hadoop-based design patterns
· Knowledge of real-time/stream analytics trends
· 5+ years of architecture, design, implementation, and/or support of complex application architectures (i.e., having an architectural sense for connecting data sources, data visualisation, structured and unstructured data, etc.)
· Demonstrable hands-on experience implementing big data solutions using the Microsoft Data Platform and Azure Data Services
· Operationalising end-to-end cloud analytics solutions
· Create a data factory, orchestrate data processing activities in a data-driven workflow, monitor and manage the data factory, move, transform and analyse data
· Create and manage experiments, determine when to pre-process or train inside Machine Learning Studio, select input/output types, apply custom processing steps with R and Python, publish web services
· Design big data real-time processing solutions: ingest data for real-time processing; design and provision compute resources; design for lambda architecture; design for real-time processing
· Deep technical experience in one or more of the following areas: Software design or development, Application Design, Systems Operations / Management, Database architecture, Virtualization, IP Networking, Storage, IT Security
· Working knowledge with AGILE development, SCRUM and Application Lifecycle Management
· Prior work experience in a DBA/Consulting/Architecture position within a software and/or services company such as Amazon, Google, IBM, Softlayer, Oracle, T-Systems, Wipro, CSC, HP, Infosys, ServiceNow, Dell, TCS, Rackspace, Cognizant
· Proficient with Azure cloud computing, including big data technologies: Azure SQL DB/DW, HDInsight, Azure Data Lake Storage, Azure Data Lake Analytics, Azure Machine Learning, Stream Analytics, Azure Data Factory, and Cosmos DB
· Desired certifications and accreditations (preferably one or more of the below, or comparable):
· Microsoft Azure Designing and Implementing Big Data Analytics Solutions (70-475)
· Microsoft Azure Designing and Implementing Cloud Data Platform Solutions (70-473)
· Microsoft Professional Program Certificate in Data Science
· Hortonworks (Associate, HDPCD)
· Experience of hybrid and/or cloud architectures that utilise Azure
· Experience in or exposure to solutions that are analytical in nature, employing data centric architectures
· Data-technology literate, with a detailed understanding of data architecture, data tools and technologies, in-memory/in-database processing, and open-source technology
· Experience of advanced data analytics and insights techniques (e.g., predictive, segmentation, recommendation and sentiment analytics, machine learning techniques, and leveraging structured and unstructured data)
· Experience working directly within teams developing solutions based on statistics, machine learning, regression models and optimization techniques to distil meaningful business insight from data
Knowledge of and/or experience in one or more of the following:
· C#, R, Python, SQL, PL/SQL, MDX, JSON, XML, .NET, C++, T-SQL, Java, PHP
· Hadoop (Hortonworks or Cloudera), Spark, Storm, Pig/Hive
· NoSQL, DocumentDb, MongoDB
· PL/SQL Developer, SQL Query Analyzer, RStudio, Power BI, Tableau, Spotfire, Visual Studio, TFS, etc.
· Technical BS degree; Computer Science or Math background highly desired