Saturday, October 5, 2019

Industry Analyst BARC Recognizes SAP Analytics Cloud | SAP News




AMSTERDAM, Netherlands — SAP SE (NYSE: SAP) today announced that leading industry analyst firm BARC recognized the SAP Analytics Cloud solution with top rankings in both its Planning Survey 19 and Enterprise BI and Analytics Platforms Report. This announcement was made at the SAPinsider conference being held June 25–27 in Amsterdam.
The Planning Survey awarded SAP 31 top rankings and 28 leading positions across five peer groups — more than any other vendor in its peer groups. The survey examined user feedback on planning product selection and use across 28 key performance indicators (KPIs), including business benefits, project success, business value, recommendation, customer satisfaction, user experience, planning functionality and competitiveness. Findings from surveyed users include:
* 100% rate price-to-value for SAP Analytics Cloud as excellent, good or satisfactory (SAP was the only vendor that achieved this result)
* 85% rate predefined planning content in SAP Analytics Cloud as very good, good or satisfactory
* 88% rate functionality in SAP Analytics Cloud for doing simulations as very good, good or satisfactory
The BARC Score: Enterprise BI and Analytics Platforms Report named SAP Analytics Cloud a market leader, placing SAP highest against 19 other competitors on market execution. The report evaluates modern enterprise BI platforms on whether they have a comprehensive toolset and the ability to scale across a variety of industries and use cases. Strengths noted for SAP Analytics Cloud include:
* User-friendly experience for ad hoc reporting, analysis and visual analysis
* Augmented analytics capabilities
* The SAP Analytics Hub solution as single access point for all analytics (cloud and on premise, SAP and non-SAP)
“BARC’s findings validate that SAP Analytics Cloud offers our customers a best-of-breed enterprise planning platform with inherent reporting and augmented analytics to allow collaboration on one connected plan to make fast, confident decisions,” said Gerrit Kazmaier, SVP, SAP HANA and Analytics. “Our modern analytics solution is growing at such a phenomenal rate because we made it easier for our customers to plan, analyze and predict all in one place. Unlike other vendors’ solutions, there is no need to stitch together and use separate tools for planning, BI and advanced analytics.”
According to BARC, SAP Analytics Cloud achieves a very good set of results in this year’s Planning Survey and Enterprise BI and Analytics Platforms Report. Convincing ratings in numerous important KPIs have helped SAP consolidate its position as a global market-leading planning, BI and analytics platform vendor.
Aço Cearense Group (GAC) is a metallurgical company based in Fortaleza, Brazil. Manuel Robalinho, IT project manager and SAP consultant at GAC, said: “Aço Cearense uses SAP Analytics Cloud to manage and streamline the company’s budget processes and operations to be more efficient. For the first time, thanks to this unique platform, we’re able to spend less time on the budget process and more time on scenario analysis to improve results”.
The latest updates delivered to SAP Analytics Cloud include analytics design and collaborative enterprise planning capabilities for the SAP Business Warehouse application, the SAP BW/4HANA solution and SAP S/4HANA. For more information, read “Announcing the Q2 2019 Release of SAP Analytics Cloud”.
To learn more, check out SAP Analytics Cloud, The Planning Survey 19 and the Enterprise BI and Analytics Platforms Report.
Visit the SAP News Center. Follow SAP on Twitter at @sapnews.

Success story of Grupo Aço Cearense presented at SAP NOW 2019





On September 11 and 12, representatives of Grupo Aço Cearense were in São Paulo taking part in SAP NOW 2019. Vinícius Amanajás, IT manager, and Manuel Robalinho, IT specialist at the Group, presented the case study “Innovating planning with SAP Analytics Cloud”, which shows how the SAP solution is driving innovative actions in the Group’s budgeting process.

SAP NOW is considered one of the largest technology events in the Brazilian market. This year its theme was “Experience management and how technology plays a fundamental role in it”, bringing together important names from the national and international markets. There were more than 16,000 participants in panels, plenary sessions and demonstrations totaling more than 550 sessions.
http://blogs.opovo.com.br/eshow/2019/09/18/case-de-sucesso-do-grupo-aco-cearense-e-apresentado-no-sap-now-2019/

Monday, May 27, 2019

Multi-Spectral Images for Steel Scrap Classification

Steel is the world’s most important engineering and construction material. It is used in every aspect of our lives. There are more than 3,500 different grades of steel with many different physical, chemical, and environmental properties. The European environmental initiative on raw materials (European Parliament and Council, 2003) has recently promoted efforts in the recycling and recovery of metal alloys. I used spectral images of scrap steel to make an efficient classification using Machine Learning techniques.
The materials used to obtain multi-spectral images were aluminum, brass, copper, iron, stainless steel, and painted iron. The equipment used was a Tetracam ADC Lite multi-spectral camera. The Anaconda framework was installed to use Jupyter with Python programming. The exploration of the image bands focuses on the analysis of the image matrices, in order to obtain classification parameters for the different materials.

Some multi-spectral images of brass and copper material (images from the author)

The classification of scrap metal is an economically relevant task, both for society and for industry. Although scrap is a waste product of society and industry, it is a valuable one. There is also the issue of managing solid waste from metal products at the end of their life, which is an objective for all citizens and receives specific incentives from some European projects. For the steel industry, the cost of producing steel from scrap is much lower than producing it from primary raw materials (iron ore). These companies use scrap as an alternative to increase their production volumes and reduce fixed and even variable costs, since scrap is generally an economically advantageous source of metal, especially in regions where energy costs are very high, for example in the United States and Europe.
The transformation of the RGB bands of an image into matrices and lists allows a Machine Learning approach, and the differences between the spectral bands of the different materials can be exploited mathematically. Several libraries available for image processing allow exploration at the level of the matrices that represent an image, its color and its reflectance under incident light.
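As a minimal sketch of this idea (using synthetic reflectance values rather than the author's real ADC Lite images, and NumPy in place of any specific image library), the band matrices of a patch can be collapsed into per-band features whose differences separate the materials:

```python
import numpy as np

# Hypothetical 4x4-pixel patches for two materials; the three channels stand in
# for reflectance bands. Real patches would come from the multi-spectral images.
rng = np.random.default_rng(0)
brass_patch = rng.normal(loc=[180.0, 140.0, 60.0], scale=5.0, size=(4, 4, 3))
copper_patch = rng.normal(loc=[150.0, 80.0, 90.0], scale=5.0, size=(4, 4, 3))

def band_features(patch):
    """Collapse an (H, W, bands) image matrix into per-band mean reflectance."""
    return patch.reshape(-1, patch.shape[-1]).mean(axis=0)

brass_feat = band_features(brass_patch)
copper_feat = band_features(copper_patch)

# The per-band means differ between the materials, which is what makes a
# mathematical (ML) separation of the spectra possible.
print(brass_feat, copper_feat)
```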

Some graphs that use the multi-spectral image information of the scrap metal (images from the author)

Another approach would be to use Google Colaboratory, which provides a Python virtual machine in the cloud and already incorporates an environment with a GPU. For the applied techniques, it was necessary to install Python libraries such as Pillow, OpenCV, openpyxl, colormath, webcolors, NumPy and Matplotlib. These libraries provide a set of facilities that allow us to analyze different issues and develop a specific approach to each problem.
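A simple classifier built on the band features can be sketched as follows. This is an illustration with invented reflectance numbers, not the author's actual model; it uses a nearest-centroid rule, one of the simplest ML techniques applicable here:

```python
import numpy as np

# Hypothetical per-band mean reflectances for a few training samples per
# material. Real values would be extracted from the multi-spectral matrices.
train = {
    "brass":  np.array([[181, 142, 61], [179, 139, 58], [182, 141, 62]], float),
    "copper": np.array([[151, 82, 91], [148, 79, 88], [152, 81, 92]], float),
}
# One centroid per material in band-feature space.
centroids = {mat: samples.mean(axis=0) for mat, samples in train.items()}

def classify(sample):
    """Assign a sample to the material with the nearest centroid."""
    return min(centroids, key=lambda m: np.linalg.norm(sample - centroids[m]))

print(classify(np.array([150.0, 80.0, 90.0])))  # falls nearest the copper centroid
```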

An example: a correlation graph of some characteristics of copper, obtained using Python. (image from the author)

Conclusions:

It is possible to classify metals using ML techniques, as described. Other technical issues, such as a correct view of the scrap, an environment with rain protection, and lighting suitable for capturing images, should be observed. In an industrial environment, we have intense service routines and high-speed equipment. High levels of dust, noise, and vibration are common in these environments and need to be considered in a future prototype application.

References

The research was done by the author, developed for a position paper as part of a university master’s program.
European Parliament and Council (2003). WEEE Directive 2002/96/EC. Eionet — Reporting Obligations Database. Source: http://ec.europa.eu
TETRACAM. ADC Lite. Source: www.tetracam.com

https://medium.com/datadriveninvestor/using-machine-learning-to-scrap-metal-classification-5458eb6ddfc9

sexta-feira, 11 de janeiro de 2019

Working with multispectral images

Saturday, January 5, 2019

What is the difference between SAP ERP and SAP ECC? Is ECC a component of SAP ERP application?




SAP ERP 6.0 is a product offered by SAP which contains business solutions such as Finance, Logistics, HR, etc., and industry solutions such as Oil and Gas, Insurance, Media, Utilities, Retail, etc.

These business solutions are enabled by several technical (software) components. A product is broken down into modular software components because a component is designed for reusability, meaning the same component can be used to build a new product. This results in shorter lead times to deliver a new product. However, a software component alone may not really be a “standalone” piece of software. For example, in the case of a car, a piston by itself is of little use, but the engine built from it is a major component of the car. Therefore, components are grouped into a super component, which in the SAP ERP world is SAP ECC. Much like the engine of a car, its components cannot be installed separately; you have to install all the sub-components in SAP ECC, the entire engine.
Just as a car’s engine consists of pistons, a shaft and cylinders, SAP ECC (the engine) consists of several sub-components such as Logistics and Accounting (SAP_APPL), Financials (SAP_FIN), Human Resources (SAP_HR), etc., plus technical platform components (comparable to a gasoline vs. diesel engine): the application server (SAP_BASIS) and cross-application components (SAP_ABA). Then there are industry-specific solution components, with names starting with IS-, and extensions (enhancements to standard functionality), with component names starting with EA-, and so on. You can explore these sub-components after logging on to SAP by going to menu option System > Status and clicking the Details button next to the ECC component.
In the image above, you can see that the complete product SAP ERP 6.0 can be configured with several (super) software components, but SAP ECC 6.0 is a mandatory one. It requires at least SAP NetWeaver 2004 to run.
By Amiya Shrivastava, ABAP Development Professional
https://www.quora.com/profile/Amiya-Shrivastava



Sunday, December 9, 2018

Deep Learning with PyTorch




Using MNIST Datasets

PyTorch is an open-source machine learning library for Python, based on Torch, used for applications such as natural language processing. It is primarily developed by Facebook's artificial-intelligence research group, and Uber's "Pyro" software for probabilistic programming is built on it.
The MNIST dataset
The MNIST dataset was constructed from two datasets of the US National Institute of Standards and Technology (NIST). The training set consists of handwritten digits from 250 different people, 50 percent high school students, and 50 percent employees from the Census Bureau. Note that the test set contains handwritten digits from different people following the same split.
The MNIST dataset is publicly available at http://yann.lecun.com/exdb/mnist/ and consists of the following four parts:
- Training set images: train-images-idx3-ubyte.gz (9.9 MB, 47 MB unzipped, and 60,000 samples)
- Training set labels: train-labels-idx1-ubyte.gz (29 KB, 60 KB unzipped, and 60,000 labels)
- Test set images: t10k-images-idx3-ubyte.gz (1.6 MB, 7.8 MB unzipped, and 10,000 samples)
- Test set labels: t10k-labels-idx1-ubyte.gz (5 KB, 10 KB unzipped, and 10,000 labels)
PyTorch provides two high-level features:
a) Tensor computation (like NumPy) with strong GPU acceleration
b) Deep Neural Networks built on a tape-based autodiff system
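The tape-based autodiff in (b) can be sketched in pure Python. This is only a conceptual toy (not PyTorch's actual implementation): each operation records its inputs and local gradients, and backward() replays that tape in reverse:

```python
# Minimal sketch of the "tape" idea behind autodiff: every operation records
# (parent, local_gradient) pairs, and backward() propagates the chain rule.
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents  # the tape: (parent_var, local_gradient) pairs

    def __mul__(self, other):
        return Var(self.value * other.value,
                   parents=((self, other.value), (other, self.value)))

    def __add__(self, other):
        return Var(self.value + other.value,
                   parents=((self, 1.0), (other, 1.0)))

    def backward(self, seed=1.0):
        self.grad += seed
        for parent, local_grad in self.parents:
            parent.backward(seed * local_grad)

x = Var(3.0)
y = x * x + x   # y = x^2 + x, so dy/dx = 2x + 1 = 7 at x = 3
y.backward()
print(x.grad)   # accumulated gradient at x
```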
To keep things short:
PyTorch consists of 4 main packages:
torch: a general-purpose array library similar to NumPy that can do computations on the GPU when the tensor is cast to a CUDA type (torch.cuda.FloatTensor)
torch.autograd: a package for building a computational graph and automatically obtaining gradients
torch.nn: a neural net library with common layers and cost functions
torch.optim: an optimization package with common optimization algorithms like SGD, Adam, etc.

PyTorch Tensors

In terms of programming, Tensors can simply be considered multidimensional arrays. Tensors in PyTorch are similar to NumPy arrays, with the addition being that Tensors can also be used on a GPU that supports CUDA. PyTorch supports various types of Tensors.
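A short sketch of these two ideas together (assuming PyTorch is installed; the numbers are arbitrary examples, not MNIST data):

```python
import torch

# Tensors behave like NumPy arrays and could be moved to a CUDA GPU
# with .to("cuda") when torch.cuda.is_available() is True.
a = torch.ones(2, 3)
b = a * 2.0           # elementwise arithmetic, as with NumPy

# Autograd: gradients are tracked for tensors with requires_grad=True.
x = torch.tensor([3.0], requires_grad=True)
y = (x ** 2).sum()    # y = x^2
y.backward()          # dy/dx = 2x = 6 at x = 3
print(b.shape, x.grad)
```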

Look for development details on my GitHub.

References:

My GitHub: 

Saturday, December 8, 2018

History of the Web



Sir Tim Berners-Lee is a British computer scientist. He was born in London, and his parents were early computer scientists, working on one of the earliest computers.
Growing up, Sir Tim was interested in trains and had a model railway in his bedroom. He recalls:
“I made some electronic gadgets to control the trains. Then I ended up getting more interested in electronics than trains. Later on, when I was in college I made a computer out of an old television set.”
After graduating from Oxford University, Berners-Lee became a software engineer at CERN, the large particle physics laboratory near Geneva, Switzerland. Scientists come from all over the world to use its accelerators, but Sir Tim noticed that they were having difficulty sharing information.
“In those days, there was different information on different computers, but you had to log on to different computers to get at it. Also, sometimes you had to learn a different program on each computer. Often it was just easier to go and ask people when they were having coffee…”, Tim says.
Tim thought he saw a way to solve this problem – one that he could see could also have much broader applications. Already, millions of computers were being connected together through the fast-developing internet and Berners-Lee realised they could share information by exploiting an emerging technology called hypertext.
In March 1989, Tim laid out his vision for what would become the web in a document called “Information Management: A Proposal”. Believe it or not, Tim’s initial proposal was not immediately accepted. In fact, his boss at the time, Mike Sendall, noted the words “Vague but exciting” on the cover. The web was never an official CERN project, but Mike managed to give Tim time to work on it in September 1990. He began work using a NeXT computer, one of Steve Jobs’ early products.

https://webfoundation.org/about/vision/history-of-the-web/

Monday, December 3, 2018

Worldwide Steel Production with Machine Learning


The objective of this work is to analyze the production of iron and steel using machine learning. The data was obtained from specialist websites, with particular emphasis on production in South America and in Brazil specifically.
The information was collected on the websites:
The 2018 figures are actual data from January to October, with November and December projected, since this work was completed at the end of November 2018.
This work also demonstrates the use of interactive maps from the folium package to present the statistics. I used Google's Colaboratory Jupyter notebook to do this work. The complete Python notebook is on GitHub.
The data sources used to create the graphs are on GitHub (folder data) in Excel format.
Let's get to work.
Because I am working in Colaboratory, I upload the files to the platform with this code:
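The upload step can be sketched like this (the real filenames are in the GitHub repo; outside Colab the google.colab module does not exist, so the sketch falls back to an empty result):

```python
# Inside Google Colaboratory, files.upload() opens a browser dialog and returns
# a dict mapping each uploaded filename to its bytes.
try:
    from google.colab import files
    uploaded = files.upload()
except ImportError:
    uploaded = {}  # running outside Colab: nothing to upload

for name in uploaded:
    print("received:", name)
```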



We need to install the xlrd package to read Excel files.



Files to read:





Some tables read from Excel:
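Reading the workbooks comes down to pandas.read_excel. The sketch below builds a tiny stand-in table and round-trips it through an .xlsx file (the real files are in the GitHub "data" folder, and the production numbers here are purely illustrative); modern pandas uses openpyxl for .xlsx, while the xlrd package mentioned above handles legacy .xls files:

```python
import pandas as pd

# Illustrative stand-in for one of the production workbooks (values in kt).
df = pd.DataFrame({
    "country": ["Brazil", "Argentina", "Mexico"],
    "production_2018": [35400, 5200, 20200],
})
df.to_excel("latin_america.xlsx", index=False)  # write a small .xlsx file

table = pd.read_excel("latin_america.xlsx")     # read it back as a DataFrame
print(table.head())
```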



Table with Latin America production:



I created a file with the geo-coordinates of the Latin American countries, to plot the statistics on a map:



Printing Graphs:






Steel Production By Region




The graph shows a decline in iron and steel production in 2018 in all markets.
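A grouped bar chart of this comparison can be sketched with Matplotlib (the region totals below are illustrative placeholders, not the actual figures from the Excel files, though they reflect the 2018 decline described above):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend: render straight to a file
import matplotlib.pyplot as plt

# Illustrative region totals in kt; real figures come from the data files.
regions = ["EU", "North America", "South America", "Asia"]
prod_2017 = [168000, 116000, 43700, 1162000]
prod_2018 = [140000, 98000, 36500, 980000]

x = range(len(regions))
width = 0.4
plt.figure(figsize=(8, 4))
plt.bar([i - width / 2 for i in x], prod_2017, width, label="2017")
plt.bar([i + width / 2 for i in x], prod_2018, width, label="2018")
plt.xticks(list(x), regions)
plt.ylabel("Steel production (kt)")
plt.legend()
plt.savefig("steel_by_region.png")
```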
Summing all markets:




Latin America Production





Creating Maps with Folium package

I merged the Latin America table with the table of geographic data:
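The merge joins the two tables on the country name. A sketch with small illustrative frames (the real ones come from the Excel files):

```python
import pandas as pd

# Illustrative production table (kt) and geo-coordinates table.
production = pd.DataFrame({
    "country": ["Brazil", "Argentina", "Chile"],
    "production_2018": [35400, 5200, 1150],
})
coords = pd.DataFrame({
    "country": ["Brazil", "Argentina", "Chile"],
    "lat": [-14.2, -38.4, -35.7],
    "lon": [-51.9, -63.6, -71.5],
})

# Inner join on the shared country column keeps rows present in both tables.
geo = production.merge(coords, on="country", how="inner")
print(geo)
```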




We need to install the folium package to create interactive maps:





I created a new column in the dataframe to build the tooltip that I want to present on each marker.
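Building that column is simple string concatenation over the dataframe (again with illustrative numbers and a hypothetical column name, "tooltip"):

```python
import pandas as pd

geo = pd.DataFrame({
    "country": ["Brazil", "Argentina"],
    "production_2018": [35400, 5200],  # kt, illustrative
})
# Text shown when hovering over each map marker.
geo["tooltip"] = geo["country"] + ": " + geo["production_2018"].astype(str) + " kt"
print(geo["tooltip"].tolist())
```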





To print the statistics on the map, I used the code below. I used the dataframe to pass the coordinates to the map.





The system creates a beautiful map. When we click on a marker, it presents the name of the country and its steel production.

Steel Products from Brazil

Since I'm in Brazil, let's take a closer look at what's happening here.



I have a lot of yearly information about production and sales, so I made separate graphs to explore it.





Creating subset dataframes for plotting, in this case production by year:
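The subsetting is a simple boolean filter on the year column. A sketch with an illustrative stand-in for the Brazil table:

```python
import pandas as pd

# Illustrative Brazil production table in long format (year x product, kt).
brazil = pd.DataFrame({
    "year": [2017, 2017, 2018, 2018],
    "product": ["flat", "long", "flat", "long"],
    "production": [15300, 9800, 14100, 9200],
})

# Keep only one year, ready to plot production by product type.
prod_2018 = brazil[brazil["year"] == 2018]
print(prod_2018)
```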





In this chart we see the same trend: production in 2018 is lower than in 2017 across all types of steel products.
Creating a subset dataframe with Brazil Steel Sales:





Sales in 2018 are lower than in 2017, both in the domestic market and in exports.

Conclusion:

This work provided knowledge about the iron and steel production market, and the Brazilian market in particular. Plotting on an interactive geographic map will be an excellent tool for presenting future work on websites.
Original article on Medium:
https://medium.com/datadriveninvestor/worldwide-steel-production-with-machine-learning-7796b423e2ea