Vimal Jose
Designation: Software Engineering Analyst
Description:
Client: Nordic Banking Major
Environment: BO 4.0, SAP Dashboards, Oracle 11G
• Involved in planning, requirement analysis and leading the development of several reports.
• Performed advanced performance tuning at the universe and report levels, improving runtimes by up to 400%.
• Spearheaded continuous improvement of the universes by investigating better alternatives to existing solutions.
• Led a team of 5 members, including colleagues at the same level.
• Worked actively with end users to capture requirements and finalize the functional specifications.
• Defined test cases to verify the correctness of report data by comparing it with backend data.
• Implemented drill-down hierarchies for multidimensional analysis as per client requirements.

Client: German ERP company
Environment: BO 4.0, SAP Dashboards, Infocubes, BEx, HANA
• Worked very closely with the client's Solutions Architect and led a team of 3.
• Designed, developed and modified universes on top of SAP Infocubes.
• Developed 7 WebI reports on top of BEx queries and universes built on SAP data providers.
• Contributed to all phases of the Software Development Life Cycle (SDLC).
• Implemented a new process flow for the design of WebI reports, which was highly appreciated by the client.
• Prepared the security architecture of the implementation, including row-level and group-wise restrictions.
• Developed reports linking different data providers in a single report, resolving incompatible objects by merging the common objects.
• Implemented index awareness in the universe to avoid issues with duplicate labels.
• Created master-detail reports with dynamic hyperlinks to sub-reports.

Client: Finnish Telecom Manufacturer
Environment: BOXI 3.1, Oracle 11G, Oracle Streams
• Received the 'ACE' award, the highest individual award for delivery excellence at Accenture.
• Participated in the complete Software Development Life Cycle (SDLC).
• Modified and optimized the performance of 4 universes and created 9 reports.
• Designed and maintained complex universes; resolved loops and traps using contexts and aliases.
• Implemented aggregate awareness to improve the efficiency of the reports.
• Improved consistency and reduced code using @functions (@Aggregate_Aware, @Select, @Where, etc.).
• Modified complex PL/SQL packages for data transformation in Oracle Streams replication.
• Configured Oracle Streams for ETL and tested the data for correctness and concurrency.
• Used index awareness to reduce the load time of prompts on a large-volume table.
• Involved in performance testing of the ETL and reports before the move to production.

Client: Indonesian Telecom company
Environment: BO 4.0, SAP Dashboards, Oracle 11i, BEx, QaaWS
• Developed 8 reports and supported the development of 5 universes on top of SAP Infocubes.
• Used BIWS and QaaWS connections to develop an innovative approach to making dashboards schedulable.
• Contributed 'Scheduling dashboards' to the Accenture Asset Library; it was reused by 10+ projects.
• Defined aliases and contexts to resolve loops and tested them for correct data retrieval.
• Created and modified hierarchies in the universes to meet drill-analysis requirements.
• Worked extensively with BO functionality such as Breaks, Sections, Ranks, Variables, Alerters, Filters and Sorts.
• Adopted LOV strategies including customized LOVs, cascading LOVs and exporting LOVs with the universe.
• One of the five most-awarded new joiners of 2011 at Accenture, Chennai (by performance points).
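The aggregate awareness mentioned above can be illustrated with a small sketch: the query engine picks the most summarized table whose grain still covers every dimension a report asks for, falling back to the detail fact table otherwise. The table names and grains below are hypothetical, not from the project.

```python
# Illustrative sketch of aggregate awareness: choose the most summarized
# table whose grain covers every dimension the query requests.
# Table names and grains are hypothetical, not from the original project.

# Tables listed from most to least aggregated (the same order BO's
# @Aggregate_Aware arguments follow).
AGGREGATE_TABLES = [
    ("agg_sales_yearly",  {"year"}),
    ("agg_sales_monthly", {"year", "month"}),
    ("fact_sales",        {"year", "month", "day", "store"}),  # detail level
]

def pick_table(requested_dims):
    """Return the first (most aggregated) table covering all requested dims."""
    for name, grain in AGGREGATE_TABLES:
        if set(requested_dims) <= grain:
            return name
    raise ValueError("no table covers: %r" % (requested_dims,))
```

A yearly report then hits the small yearly aggregate, while a per-store query falls back to the detail fact table.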
Designation: Associate Consultant
Description:
Client: British Energy Major
Environment: R, BO 4.1, Tableau, SQL Server 2014, Oracle 11G
• Shadowed the advanced analytics team out of personal interest and completed data cleansing and Exploratory Data Analysis for more than 7 assignments.
• Gained expertise in regression models (linear and logistic), residual analysis, regularization techniques and the bias-variance trade-off.
• Developed classification models with Naïve Bayes and Random Forests, achieving F-scores > 0.75.
• Took part in end-to-end project planning and worked through all SDLC phases for reports and universes.
• Involved in the planning, analysis, design and development of 20+ universes and 100+ reports.
• Worked closely with the clients and was highly appreciated for proactive problem-solving skills.
• Led a team of 6 members and conducted training sessions to cross-train SQL developers in BO.
• Worked closely with the Tableau, database and BO teams to identify where to place business rules.
• Delivered a project that, due to an unforeseen circumstance, had to be completed in ~65% of the estimated time, by leading the team in agile mode and through detailed delegation of tasks.
• Introduced automated testing of universes by guiding a VBA developer from the bench in adapting freely available VBA code to suit our requirements.
• Received 7 appreciations in a single year from clients, managers and team members, and achieved the top rating.
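The F-score used to evaluate the Naïve Bayes and Random Forest classifiers above is the harmonic mean of precision and recall; a minimal sketch, with toy labels rather than project data:

```python
# Minimal sketch: computing the F1 score used to evaluate the classifiers
# mentioned above. The labels passed in are toy data, not project data.

def f1_score(y_true, y_pred, positive=1):
    """F1 = harmonic mean of precision and recall for the positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```

An F-score above 0.75 therefore requires both precision and recall to be reasonably high, which is why it is a stricter target than raw accuracy on unbalanced classes.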
Company: Dell EMC
Description:
Client: Qatar-based Telecom major
Environment: Python, R, SQL, SAP Business Objects, Tableau, Greenplum
• Engaged from the requirements phase itself: travelled to the client site, met stakeholders including Country Heads, and converted user requirements into clear data science and analytics use cases.
• Built clustering and classification models to determine upsell and cross-sell opportunities in mobile packs using kernel SVM (the RBF kernel gave the best accuracy; Random Forest and XGBoost were also tried).
• Completed a thorough Exploratory Data Analysis and inferential statistics on the CDR data for use in models.
• Led the requirement gathering, analysis, development and testing of BI reports in Tableau and BO.
• Designed and developed the semantic layers for analytics and data warehouse modelling.
• Built a collaborative-filtering model for a personalized-marketing recommendation engine, improving purchase rates by 11% with an intent accuracy of more than 40% over a span of 3 months.
• Used A/B testing to compare the efficiency of a new UX design against the existing design.
• Built a proof of concept on Restricted Boltzmann Machines for recommendation engines.
• Led a team of 8 in developing PostgreSQL functions in Greenplum (a Massively Parallel Processing DB) for ETL.

Client: Leading Bank in APJ
Environment: Python, R, SQL, Tableau, Spark, Hive
• Built Generalized Sequential Pattern (GSP) and Apriori models for IVR optimization, achieving ~30% faster response times on a backend of Hive external tables.
• Architected and designed the data pipelines for Data Science as well as BI reporting.
• Designed the dimensional model for reporting and created Tableau and SAP BO reports and dashboards for performance analysis at multiple granularities.
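The Apriori mining used above for IVR optimization can be sketched roughly as follows: count frequent itemsets of menu options, growing candidates one item at a time and pruning anything below a support threshold. The sessions and threshold below are illustrative, not project data.

```python
# Toy sketch of the Apriori idea behind the IVR optimization above:
# find frequently co-occurring IVR menu options, level by level.
# The transactions passed in are made-up key sequences, not project data.
from itertools import combinations

def apriori(transactions, min_support):
    """Return {frozenset: support_count} for all frequent itemsets."""
    items = {i for t in transactions for i in t}
    frequent = {}
    candidates = [frozenset([i]) for i in sorted(items)]
    k = 1
    while candidates:
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        k += 1
        # join step: union pairs of frequent (k-1)-itemsets into k-itemsets
        keys = list(level)
        candidates = list({a | b for a, b in combinations(keys, 2)
                           if len(a | b) == k})
    return frequent
```

Frequent option pairs like these are what suggest which IVR menu paths to shorten or merge.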
• Performed data cleansing and wrangling on IVR data using tidyverse packages, including outlier identification and imputation for customer call data.

Internal: Machine Learning Bootcamp
Environment: R
• One of 3 trainers who conducted a Machine Learning Bootcamp for the whole Data Engineering Division.
• Covered topics from the basics of R programming through algorithms such as Naïve Bayes, K-Means clustering and KNN classification; dimensionality-reduction techniques such as PCA, LDA and t-SNE; time-series techniques such as Holt-Winters and ARIMA variants; and advanced clustering methods such as DBSCAN and Expectation Maximization (Gaussian mixtures).
• Also discussed statistical concepts including correlation analysis, ANOVA, pair-wise t-tests, z-scores, confidence intervals, skewed data, unbalanced classes and multicollinearity.
• The bootcamp was very well received and drew appreciation from the APJ heads.

R&D: Security Analytics
Environment: Python, R, Tableau
• Built ARIMA models to detect anomalous network behaviour by analysing domain controller log data.
• Used AdaBoost, Random Forests and model stacking to correctly classify user behaviour after aggregating over a weekly time span.
• Used statistical tests to estimate the significance of the readings from two different domain controllers.

Client: EMC Product Team (VxRail)
Environment: TensorFlow, Python, R, Tableau
• Forecasted internal metrics with incremental learning using the ADAM optimizer in TensorFlow (Stochastic Gradient Descent and Nesterov Accelerated Gradient (NAG) were also tried).
• Used multi-threaded queue runners to accelerate data loading for out-of-core learning.
• Improved prediction speed by:
  o L2 regularization for generating a sparse matrix (ElasticNet with multiple mixing ratios was also tried);
  o downsizing the float variables to 16 bit;
  o tuning the dropout parameter so that the surviving neurons generalize well.
• Also took part in scalably productionizing this model using REST APIs and netis for message queueing.

R&D: Video Analytics
Environment: Python, OpenCV
• Participated in an R&D project on building an analytics framework for video.
• Implemented pedestrian recognition with ~70% accuracy using HOG descriptors and a linear SVM on CCTV feeds.
• Generated heat maps of human activity levels from CCTV feeds so the customer could improve store layout and product placement.
• Implemented a proof of concept for a scalable facial-recognition architecture with HOG descriptors and 128-dimensional face embeddings using Pillow and OpenFace in Python.
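The HOG descriptors behind the pedestrian detector above bin image gradient orientations into magnitude-weighted histograms. A much-simplified sketch of that core step (real HOG, as in OpenCV's HOGDescriptor, adds cells, overlapping blocks and normalization; the patch here is a toy grayscale grid):

```python
# Simplified sketch of the HOG idea behind the pedestrian detector above:
# bin gradient orientations of a grayscale patch into a small histogram.
# Real HOG adds cells, blocks and block normalization; this is illustrative.
import math

def gradient_histogram(patch, bins=9):
    """patch: 2D list of grayscale values. Returns an orientation histogram."""
    h, w = len(patch), len(patch[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = patch[y][x + 1] - patch[y][x - 1]   # central differences
            gy = patch[y + 1][x] - patch[y - 1][x]
            mag = math.hypot(gx, gy)
            angle = math.degrees(math.atan2(gy, gx)) % 180  # unsigned 0-180
            hist[int(angle // (180 / bins)) % bins] += mag  # magnitude-weighted
    return hist
```

Concatenated over a grid of cells, such histograms form the feature vector that the linear SVM classifies as pedestrian or not.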
Course: SSLC
University: AKJM HSS
Duration: - 2005
Course: Higher Secondary Education
University: St. Dominic's HSS
Duration: - 2007
Course: Bachelor of Technology (Computer Science)
University: Amal Jyoti College of Engineering
Duration: - 2011
Course: Post Graduate Diploma in Data Science
Duration: - 2018