Experienced software developer and consultant with a demonstrated history in the information technology and services industry. I work with C# across .NET Core, .NET Framework and .NET 5, covering platform design, algorithms and device communication in the IoT industry. So far I have worked in Industry 4.0, healthcare, utilities, retail, agriculture, airlines, logistics and the public sector. I manage and deliver end-to-end solutions using the latest technologies. My main areas of expertise are the Internet of Things and Artificial Intelligence.
Fernando Rufo
c/ Lagasca 80, 5ºA
28001 Madrid, Spain
+34 651 80 23 13
fernando.rufo97@gmail.com
Artificial Intelligence Master's Degree • February 2022
Master's Degree focused on Artificial Intelligence.
Big Data & Analytics Master's Degree • March 2003
Master's Degree focused on Big Data and Analytics. The final project was a predictive analysis of soccer matches correlated with bookmakers' odds in order to maximize returns.
Cloud Engineer || Consultant 1 • January 2022 - Present
Cloud Engineer based in Madrid, developing software for the IoT Industry in .NET Core, .NET Framework and .NET 5 all based on an Azure environment.
Development & Cloud Consultant • September 2021 - January 2022
Cloud consultant in an Azure environment, working with .NET Core, .NET 5, Angular and TypeScript. Mainly C# and microservices on Azure, with Docker and Kubernetes also used in these environments.
Senior Software Developer • October 2021 - September 2022
Platform development for asset traceability, covering storage, production, stock-level management and reporting, and customer delivery, tracing each asset until it is received by the customer. These platforms span human resources, logistics, devices and stock levels. The solutions are ASP.NET web applications built in Visual Studio using C#, jQuery, CSS, Bootstrap, HTML and JavaScript; I owned the back end and supported front-end development.

The projects also required sending and receiving information to and from devices, tracing that information and presenting it to clients. This runs in the same Visual Studio and ASP.NET applications, through the Web API layer. The data flow has to be as efficient as possible, and much of it is critical — one of the projects, for example, involves gas cylinder filling. We test the communication process with Postman API requests; data is sent over the MQTT protocol using Node-RED, with MQTT brokers and topics per client. For data visualization I used Grafana (an open-source tool for time series) and Power BI. The databases were hosted in SQL Server.

Project development is hosted in Azure DevOps; client requests are tracked in a Jira platform I personally developed when I first joined the company. Git repositories are managed with Sourcetree and Azure DevOps. We work with Agile methodologies, holding daily stand-ups to discuss problems encountered since the last meeting, supported by Asana and our Jira application.
Advanced Technologies Consultant || Analyst • October 2019 - April 2020
ETL processes with SSIS, using SQL Server as the database. The client was an airline, and we developed ETL solutions for their ERP, handling large data volumes efficiently. In this project I used SQL Server to query large volumes of data daily, detecting possible failures in their files in order to clean errors from the database and correct past maintenance fixes. SSIS made the ETL process feasible for such large volumes of data through daily process automation. In another project, for a renewable energy company evaluating possible investments, I queried and injected data from scratch, simulating API ingestion, to test the future viability and efficiency of the project. The database was MongoDB, a non-relational database.
SQL Server, SSIS, Power BI, Grafana, .NET Core, .NET Framework, .NET 5, TypeScript, JavaScript, Angular, Python, C#, HTML, CSS, jQuery, Node-RED, MQTT messaging protocol, Tableau, Azure, Reflection, Datagrid, Postman API.
Asset traceability and quality control using sensors and complex algorithms to read data, process it and make decisions.
QC, Optimization: Through sensors, RFID tags and a scale we could determine the weight, the cylinder type and its maximum capacity, controlling cylinder filling with emergency systems to prevent failures.
Automation, Algorithms: With RFID tags placed on assets and at key points around the factory, the client could see each asset's location in real time, when it was moved from one place to another, and the places it visited before reaching its current location. This also optimized stock levels, preventing products from devaluing in storage.
Branding: An integration with the asset traceability platform, so the client could create their own ZPL code based on size, tag type or the information they wanted to place on it. The system evaluated the data and, through an algorithm, stored the ZPL code, showing the final result with information received from the ZPL API.
User Experience, Tag: International client management experience across projects from all around the world, with strong response times and quality.