It is no news that fintech companies are a real threat to the future of large financial corporations. But how has fintech (financial technology) been driving the future of the financial sector, and how have traditional, established corporations responded to these threats?
Fintechs have been operating in the market for a long time. So what is special now? To answer this question, we need to think a bit about the type of products and services these companies offer, their target audience, and why they are keeping financial institutions awake at night.
More than simply developing new systems or upgrading legacy systems for financial institutions, fintechs began to develop and bring to market their own integrated applications and solutions to compete directly with them: mobile payment applications, money transfers, credit card services, and, in some cases, even the opening and maintenance of fully digital checking or savings accounts, including services such as P2P loans, mortgages, financing, and insurance. In this model, the client no longer needs any physical interaction with the institution. And by offering their services and a complete customer service environment digitally, without physical branches, fintechs can provide their products and services at a fraction of the cost of traditional banking products, as Nubank does with its no-annual-fee credit card. In addition, banks have not shown the same ability to adapt quickly to changes in digital consumer behaviour, something fintechs do easily and very well.
Increasing regulatory demands are also forcing banks to rethink their business models. Facing tighter budgets and anxious to meet all the major market standards and regulations, these institutions are looking for ways to improve efficiency and reduce costs. At the same time, their heavy, entangled decision-making processes make the implementation of new ideas an arduous and slow task, yet another challenge for banks. Against this backdrop, the agility of fintechs, with their lean mindset and innovation-focused culture, stands out as a strong factor behind their rapid growth.
All that without even mentioning so-called cryptocurrency, or digital money, which has led large institutions like Goldman Sachs and JP Morgan to launch global projects to increase the efficiency of asset trading and settlement.
In the Brazilian market, the number of smartphone owners surpassed the milestone of 76 million in the third quarter of 2015. Most people prefer the speed and convenience of the applications available for these devices. Within this growing number we have the so-called Generation Z: people who were born and grew up with the World Wide Web (1990) and the explosion and popularization of technological devices by the end of 2010. This demanding and highly digital mass of consumers is no longer satisfied with the traditional model offered by financial institutions and is therefore more willing to try new banking products and services from companies that do not yet have the solid brand recognition traditional banks enjoy. This generation of digital natives will make the difference and define which companies will still exist in a future that has already begun.
In 2015, some UK fintechs obtained banking licenses with the agreement of the government and market regulators and were allowed to expand their portfolio of mobile products and services, increasing competitiveness in the sector. Given the speed of change in technology and financial services, no financial institution, however well established in the market and with its customers, can afford to ignore the threats or the opportunities that fintechs represent.
According to a recent PwC report (Blurred Lines: How FinTech is Shaping Financial Services), by 2020 more than 20% of financial services business may be at risk because of emerging fintechs. So, now more than ever, financial institutions need to change their mindset to meet the needs of the digital consumer, integrating the digitization of their processes into their corporate DNA. According to the same study, the ways to achieve this are: put fintech at the heart of the strategy; adopt a mobile-first approach; collaborate with fintech companies; and understand the underlying regulatory challenges.
Modern consumers increasingly compare the digital banking experience of their bank with that of companies like Apple, Amazon, and Google, which are not famous for banking services but, like fintechs, have been keeping many traditional financial services organizations awake at night. Financial institutions that do not define and start executing a real digital strategy for the coming years will face serious problems remaining profitable in this market.
This article was originally published on LinkedIn in April 2016.
Disciplined Agile (DA) is a framework for helping organizations become more agile. It is a hybrid approach that combines agile, lean, and traditional practices to create a flexible and adaptable way of working.
By applying PMI® PMBOK® performance domains in conjunction with principles of project management, a former project manager can successfully perform as a product manager and effectively deliver product outcomes that meet stakeholder expectations.
Although at first sight the API (application programming interface) might seem a very technical subject, the main idea of this article is to be business-friendly and show how companies can use APIs to boost their business.
A software application built for an end user is typically designed to make the act of consumption easy, fast, simple, and pleasant.
Software doesn't have eyes, emotions, or intuition, so it doesn't need a user-friendly interface when its job is to deliver a service to another application. Nonetheless, just as with a UI tailored to humans, software needs an interface that makes it easy to consume data and/or functionality.
An API is a set of established routines and standards (software) that allows its functionality to be consumed by applications that do not intend to know, or be involved in, implementation details of the software itself, but simply use its services. APIs are often described as machine-readable interfaces (versus human-readable ones).
A good way of thinking about APIs is in the context of a wall socket. The electrical sockets found in the walls of homes and businesses are essentially interfaces to a service: the electricity that's consumed by everything from our computers and smartphones to TVs, refrigerators, hair dryers, and many other devices. The electricity is the service, and each device that uses electricity is a consumer of that service.
Although they may differ depending on where in the world they are, electrical sockets have standard patterns of openings into which matching electrical plugs fit, and the service itself also conforms to certain specifications (e.g., in Brazil, most wall sockets deliver 110 volts of alternating current, or 127 volts to be very precise). This specification essentially sets an expectation for consuming devices. Likewise, an API specifies how software components should interact with each other.
Through the standard interface, any compatible consumer can easily draw the energy needed to power its devices, and for the consumer it simply does not matter what is happening on the other side of the wall, from simple things like the colour of the wiring to more complex "details" like how the electricity is generated (whether from a wind farm, nuclear plant, coal-fired generator, hydropower plant, or solar array) or where those sources of power are located.
And the other way around: the provider does not need to explain or expose details of how the services are delivered. In a tightly regulated market like financial services, this is particularly valuable, because data privacy, bank secrecy, and account holders' sensitive personal information are kept "behind the walls."
The standards of the API are described and made open to any company or developer interested in using the banking services, through official documentation usually offered via a web portal.
In the financial services world, the services could be loans, insurance, e-commerce, payments, customer information, customer transaction history, authentication, wire transfers, and so on, all of which can potentially be delivered to a wide range of compatible consumers (e.g., smartphone apps) wherever an end user requires them.
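To make this concrete, here is a minimal sketch of how a third-party application might consume such a banking service over HTTP. Everything specific here (the base URL, endpoint path, token, and response fields) is hypothetical, invented purely for illustration; a real provider's developer portal would define the actual contract:

```python
import requests

# Hypothetical open banking endpoint and access token (illustrative only).
BASE_URL = "https://api.examplebank.com/v1"
ACCESS_TOKEN = "token-obtained-via-oauth"

def get_recent_transactions(account_id: str) -> list:
    """Fetch an account's transaction history without knowing anything
    about how the bank implements the service internally."""
    response = requests.get(
        f"{BASE_URL}/accounts/{account_id}/transactions",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"limit": 10},
        timeout=10,
    )
    response.raise_for_status()  # surface HTTP errors early
    return response.json()["transactions"]

if __name__ == "__main__":
    for tx in get_recent_transactions("12345-6"):
        print(tx["date"], tx["amount"], tx["description"])
```

Just like the wall socket, the consumer only needs the documented plug shape: the URL, the authentication scheme, and the response format.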
A merchant that today offers products and services can now add value-added services for its customers in a secure way, accelerating conversion and reducing abandonment.
In this new business model, a bank can expand its reach and increase revenue; on the other hand, the exposure of its brand will be lower or non-existent. Whether that trade-off is worthwhile is a business strategy decision, depending on the expected results.
Getting back to security, APIs are commonly offered in three models: private (internal), public (external), or restricted (partner). Once again, it depends on the company's strategy. If the data to be exposed is very sensitive, that may lead you to use a private or restricted API. Despite being potentially more profitable, this kind of API will not be as easily maintained or enhanced: you will not have the support of the developer community, as public APIs do.
Innovation, cost, and security are critical factors to take into account in this decision.
The use of open APIs fosters a customer-centric ecosystem and open innovation. As of today, more than 15 thousand APIs (across the most varied market segments, such as financial services, government, mobile, transportation, education, science, and others) are available at Programmable Web. Beyond information, guidance, and useful references, the portal works not only as a directory for companies looking for APIs but also as a place for developers and providers to offer them.
The Open Bank Project is an open-source API and app store for banks that empowers financial institutions to securely and rapidly enhance their digital offerings using an ecosystem of third-party applications and services.
A very successful case of open banking APIs is the BBVA API Market. With a presence in more than 35 countries and open, innovative strategies such as mobile-first, omnichannel, and digital, BBVA offers a wide set of products through its portal.
The economic possibilities of APIs are vast. Exploring this universe requires experience and the technical capacity to undertake such projects and help companies realize those benefits.
First published on LinkedIn in September 2016.
IoT (Internet of Things) is the buzzword of the moment, but the movement around the technology has been under way since 2009 and should continue. According to IDC, the number of devices connected to the IoT universe will be around 30 billion in 2020, and a potential market estimated at 7.3 billion dollars is forecast as early as 2017.
The origin of the name Internet of Things is attributed to Kevin Ashton. The term was used in a presentation he gave in 1999 at Procter & Gamble (P&G). Ten years later, in an article published in the RFID Journal, he referred back to that presentation and stated what many consider the definition of IoT.
According to Ashton, "if we had computers that knew everything there was to know about things, using data gathered without any human interaction, we would be able to monitor and measure everything, reducing waste, loss, and cost. We would know when things needed to be replaced, repaired, or updated, and whether they were state of the art or had become obsolete."
Since then, IoT has been developing and mutating through the use of converged networks (mainly wireless), Micro-Electro-Mechanical Systems (MEMS), and the Internet.
The "Things" can be a heart monitor, a transmitter chip, a tracker, a thermometer, a security camera, a door, sensors in a car engine, even a shark: in short, anything natural or built by human hands that can send and/or receive data over a wireless or wired network.
Today there is also talk of IoE (Internet of Everything) and WoT (Web of Things).
Academically, we could understand IoT and IoE as relating to machines, sensors, and "things" that communicate and exchange information with one another through devices on a data network (wired or not), while WoT refers to software, applications, and websites providing and consuming resources and information among themselves.
But it is important to remember the "original definition", because it points to a very important component that separates what is and what is not the Internet of Things. Ashton uses the phrase "without any human interaction", and that is where most of the mistakes appear when we say that something is or is not the Internet of Things.
When I approach my house in my car and my phone, with an associated IP address, communicates with the garage door so that it opens automatically, I am in an Internet of Things environment.
When, as I enter the house, the air conditioner senses my presence and turns on automatically, even detecting the outside temperature in order to keep the house at a pleasant room temperature, that is also the Internet of Things. And if, still without any human interaction, my sound system or my TV is switched on automatically, we have the Internet of Things.
But we can go further. The air conditioner's sensors could learn from my habits and schedule and, knowing what time I get home from work and which temperature I prefer, switch itself on shortly before my arrival, so that the rooms are already at the ideal temperature when I get there.
In the same way, the sound system could learn that on Tuesdays, Thursdays, and Saturdays I listen to blues and jazz. On the other days, I like listening to classic rock. On Sundays it would not switch itself on; the TV, for instance, would be the device triggered on Sundays, even sending alerts to my phone as reminders of when my favourite shows air.
Leaving the personal context, the use of IoT in the business environment also brings countless conveniences. Imagine an environment where a computer is running a heavy computation (a calculation over very large numbers) for oil platforms. The calculation would take a certain amount of time. I would only need to place, say, a laptop next to this computer, and the machines would talk to each other. Something like this:
- Hi! I'm machine XYZ and I'm running a calculation. Is your processor idle? I need more processing power to complete the job.
To which the other would reply:
- Hi! I'm machine ABC. Yes, it's idle, and as of this moment I'm releasing my processor for your use; we'll work as a cluster to reduce the response time of the requested calculation.
And the sharing of processing resources would start automatically.
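Purely as an illustration of this handshake, the sketch below simulates the two "machines" as Python threads talking over a local socket. The machine names, the port, and the message format are all invented for the example:

```python
import socket
import threading
import time

PORT = 5555  # arbitrary port chosen for this illustration

def machine_abc():
    """Machine ABC: listens and offers its idle processor on request."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.bind(("localhost", PORT))
        server.listen(1)
        conn, _ = server.accept()
        with conn:
            if conn.recv(1024).decode() == "CPU_IDLE?":
                conn.sendall(b"Yes, joining the cluster now")

def machine_xyz():
    """Machine XYZ: finds a neighbour and asks for processing power."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
        client.connect(("localhost", PORT))
        client.sendall(b"CPU_IDLE?")
        print("ABC replied:", client.recv(1024).decode())

# Run both "machines" side by side on one host for demonstration.
listener = threading.Thread(target=machine_abc)
listener.start()
time.sleep(0.5)  # give the listener time to start accepting connections
machine_xyz()
listener.join()
```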
That is the Internet of Things! A world of possibilities for technological progress in the most diverse areas, such as medicine, automotive, oil, and services, among others, with a forecast of growth in devices and revenue that sparks our interest and fires our imagination.
Published on Medium in October 2014.
According to the Aurélio dictionary, "standardization" is the act of standardizing, which in turn points to the unification of standards.
When I wrote the first article on the Internet of Things, "From the Origins to the Future", the idea was to present, in a clear and direct way, the origin and main characteristics of a potential market estimated in the billions of dollars.
The second article, "Information Security in the World of the Internet of Things", was intended precisely to complement the first by pointing out the main security issues surrounding this market, since such a lucrative market certainly also attracts malicious people and/or entities.
Finally, when we talk about interconnecting millions of devices through a communication network, we also need to think about the standards through which all these devices will actually be able to communicate and, in this way, generate the expected value for organizations together with adequate information security.
Does it sound like an easy task? Not quite.
While communication network standards already have a high degree of maturity (though they are still under constant development), there is as yet no defined standardization for devices. Here we see the need for integration between the organizations that provide the technology and the organizations that provide the devices that will make use of the Internet of Things.
The consortia below have recently joined forces in a strategic alliance for information sharing and technical collaboration, aiming to accelerate the delivery of an IoT architectural framework for industry:
Other initiatives under way include:
In short, through an efficient global standardization process, complemented by strict information security practices, the already widely known benefits of the Internet of Things are increasingly establishing themselves as a financially advantageous reality for organizations, one that can effectively help improve people's lives.
Published at TI Especialistas in May 2015.
With the rapid expansion of Internet of Things adoption around the world, and the growing spread of malware targeting every kind of hardware and software (whether operating systems or applications), concern for information security (personal and corporate data) must also remain among the top priorities of the information technology industry.
The Internet of Things brings hundreds of thousands of devices exchanging information with each other over the Internet. This information may, for example, have been collected by devices attached to a patient's body, sending health-status data and even test results to the doctor wherever he or she may be, viewable even on a smartphone. This data can easily be intercepted, modified, or used for the benefit of those who have no right to it.
Likewise, when we say that the garage door, the air conditioner, or any other device connected to our home network can be triggered merely by the presence of its owner's smartphone, we must also acknowledge that all of these devices are exposed to the actions of malicious people. A technology specialist with good knowledge of programming languages and network protocols can easily create malware to act in his or her own favour.
Malware is software specifically designed to perform harmful actions and malicious activities, such as obtaining financial advantage, collecting confidential information, vandalism, running scams, carrying out attacks, and spreading spam.
Obviously, when we highlight the business opportunities involved in the Internet of Things, malicious people will also look for ways to profit from this potential market. What can be done? From the end user's perspective, prevention is still the best practice when it comes to security.
Keeping the firewall and anti-malware software up to date, always using original and updated programs, downloading applications only from trusted sources or official stores, not accessing confidential information or carrying out financial transactions over public Wi-Fi networks, verifying the authenticity of a link before clicking on it, and paying attention to the validity of the digital certificates shown in the browser are some of the security measures to be taken.
On the side of companies and service providers, the main change is one of mindset. Are we prepared to receive these devices inside the corporate network? How do we isolate user and application traffic from traffic carrying business-sensitive data? How do we guarantee quality of service? How do we guarantee enough bandwidth to meet the demand of the "things" without impacting the company's core business? In short, we all want to seize the opportunities the Internet of Things can provide, and we want them to always come with security appropriate to the information involved.
Published on Medium in February 2015.
Applying the Design Thinking model can bring the benefits needed for a Digital Transformation project to succeed.
We are talking about bringing consumers and companies closer together, allowing the consumer to have a cross-channel shopping experience.
Introduction
Data volumes are blowing up. More data has been created in the past two years than in the entire previous history of humankind. Moreover, in 2017 alone, a wide and growing range of data coming from all types of industries (financial, IoT, healthcare, automotive, astronomy, biotech, cybersecurity, social media, and entertainment, among several others) can push these impressive numbers even higher. Nonetheless, research has found that less than 0.5 percent of that data is actually analyzed for operational and business decision-making.
By the year 2020, about 1.7 megabytes of new information will be created every second for every human being on the planet.
By then, our accumulated digital universe of data will grow from 4.4 zettabytes today to around 44 zettabytes, or 44 trillion gigabytes.
According to IDC, the number of connected devices in the IoT universe will reach 30 billion by 2020, and the forecast is a potential market estimated at 7.3 billion dollars as early as 2017.
This huge mass of data is what everyone is talking about, and with proper handling it can bring significant results and changes for all those industries and for human life.
A-Z
Analytics has emerged as a catch-all term for a variety of different business intelligence (BI)- and application-related initiatives. For some, it is the process of analyzing information from a particular domain, such as website analytics. For others, it is applying the breadth of BI capabilities to a specific content area (for example, sales, service, supply chain, and so on). In particular, BI vendors use the "analytics" moniker to differentiate their products from the competition. Increasingly, "analytics" is used to describe statistical and mathematical data analysis that clusters, segments, scores, and predicts what scenarios are most likely to happen. Whatever the use case, "analytics" has moved deeper into the business vernacular and has garnered burgeoning interest from business and IT professionals looking to exploit huge mounds of internally generated and externally available data.
Artificial Intelligence is the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence, but AI does not have to confine itself to methods that are biologically observable.
Big Data is a term that describes the large volume of data – both structured and unstructured – that inundates a business on a day-to-day basis. However, it’s not the amount of data that’s important. It’s what organizations do with the data that matters. Big data can be analyzed for insights that lead to better decisions and strategic business moves.
Business Analytics is one aspect of business intelligence, which is the sum of all your research tools and information infrastructure. Due to this close relationship, the terms business intelligence and business analytics are sometimes used interchangeably. Strictly speaking, business analytics focuses on statistical analysis of the information provided by business intelligence.
Business intelligence (BI) refers to the procedural and technical infrastructure that collects, stores, and analyzes the data produced by a company's activities. Business intelligence is a broad term that encompasses data mining, process analysis, performance benchmarking, descriptive analytics, and so on. Business intelligence is meant to take in all the data being generated by a business and present easy-to-digest performance measures and trends that will inform management decisions.
Clustering is the task of dividing the population or data points into a number of groups such that data points in the same group are more similar to one another than to data points in other groups. In simple words, the aim is to segregate groups with similar traits and assign them into clusters.
A business intelligence Dashboard is a data visualization tool that displays the current status of metrics and key performance indicators (KPIs) for an enterprise. Dashboards consolidate and arrange numbers, metrics and sometimes performance scorecards on a single screen. They may be tailored for a specific role and display metrics targeted for a single point of view or department. The essential features of a BI dashboard product include a customizable interface and the ability to pull real-time data from multiple sources.
Data Grids are in-memory distributed databases designed for scalability and fast access to large volumes of data. More than just a distributed caching solution, data grids also offer additional functionality such as map/reduce, querying, processing for streaming data, and transaction capabilities.
Data Mining is the practice of automatically searching large stores of data to discover patterns and trends that go beyond simple analysis. Data mining uses sophisticated mathematical algorithms to segment the data and evaluate the probability of future events. Data mining is also known as Knowledge Discovery in Data (KDD).
Data Science is an interdisciplinary field about scientific methods, processes, and systems to extract knowledge or insights from data in various forms, either structured or unstructured, similar to data mining. It is a concept that unifies statistics, data analysis, and their related methods in order to understand and analyze actual phenomena with data. It employs techniques and theories drawn from many fields within the broad areas of mathematics, statistics, information science, and computer science, in particular from the subdomains of machine learning, classification, cluster analysis, data mining, databases, and visualization.
A Data Warehouse is a storage architecture designed to hold data extracted from transaction systems, operational data stores and external sources. The warehouse then combines that data in an aggregate, summary form suitable for enterprise-wide data analysis and reporting for predefined business needs.
ETL tools perform three functions to move data from one place to another: extract data from sources such as ERP or CRM applications; transform that data into a common format that fits with the other data in the warehouse; and load the data into the data warehouse for analysis. A minimal sketch of those three steps follows.
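In the sketch below, pandas and SQLite stand in for a dedicated ETL tool; the file name, column names, and target table are invented for the example:

```python
import sqlite3

import pandas as pd

# Extract: read raw records from a source system export
# ("sales_export.csv" and its columns are hypothetical).
raw = pd.read_csv("sales_export.csv")

# Transform: normalize column names, parse dates, and drop bad rows
# so the data fits a common warehouse format.
transformed = (
    raw.rename(columns={"SaleDate": "sale_date", "Amount": "amount"})
       .assign(sale_date=lambda df: pd.to_datetime(df["sale_date"]))
       .dropna(subset=["amount"])
)

# Load: append the cleaned rows into the warehouse table.
with sqlite3.connect("warehouse.db") as conn:
    transformed.to_sql("fact_sales", conn, if_exists="append", index=False)
```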
Microsoft Excel is a software program that allows users to organize, format and calculate data with formulas using a spreadsheet system.
Fact tables are the foundation of the data warehouse. They contain the fundamental measurements of the enterprise, and they are the ultimate target of most data warehouse queries. The real purpose of the fact table is to be the repository of the numeric facts that are observed during the measurement event.
Google Analytics is a Web analytics service that provides statistics and basic analytical tools for search engine optimization (SEO) and marketing purposes.
The Apache Hadoop project develops an open-source software framework for reliable, scalable, distributed computing that allows the processing of large data sets across clusters of computers using simple programming models. Other Hadoop-related projects at Apache include Ambari, HDFS, MapReduce, Hive, HCatalog, HBase, ZooKeeper, Oozie, Pig, and Sqoop.
The Apache Hive data warehouse software facilitates reading, writing, and managing large datasets residing in distributed storage using SQL.
A histogram is a graphical representation, similar in structure to a bar chart, that organizes a group of data points into user-specified ranges. The histogram condenses a data series into an easily interpreted visual by taking many data points and grouping them into logical ranges or bins.
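For instance, a few lines of Python (matplotlib and NumPy being one common toolset) are enough to bin a synthetic data series and draw it:

```python
import matplotlib.pyplot as plt
import numpy as np

# Synthetic data series: 1,000 normally distributed observations.
data = np.random.normal(loc=50, scale=10, size=1000)

# Group the points into 20 user-specified bins and draw the histogram.
plt.hist(data, bins=20, edgecolor="black")
plt.xlabel("Value")
plt.ylabel("Frequency")
plt.title("Histogram of a synthetic data series")
plt.show()
```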
Julia is a high-level, high-performance dynamic programming language for numerical computing. It provides a sophisticated compiler, distributed parallel execution, numerical accuracy, and an extensive mathematical function library.
K-means is one of the oldest and most commonly used clustering algorithms. It is a prototype-based clustering technique that defines the prototype in terms of a centroid, considered to be the mean of a group of points, and it is applicable to objects in a continuous n-dimensional space.
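A quick sketch of K-means in practice, using scikit-learn on synthetic two-dimensional points (the data and parameters are invented for illustration):

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic 2-D points drawn around three different centers.
rng = np.random.default_rng(42)
points = np.vstack([
    rng.normal((0, 0), 0.5, size=(50, 2)),
    rng.normal((5, 5), 0.5, size=(50, 2)),
    rng.normal((0, 5), 0.5, size=(50, 2)),
])

# Fit K-means with k=3; each centroid is the mean of its cluster.
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(points)
print("Centroids:\n", model.cluster_centers_)
print("First ten labels:", model.labels_[:10])
```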
Keyrus is a company specialized in performance management consulting and the integration of innovative technological solutions in the Data (BI, Big Data Analytics, EPM) and Digital (digital commerce and omnichannel, eCRM, web performance, and so on) fields.
Machine Learning is a core subarea of artificial intelligence that studies computer algorithms that learn to perform tasks automatically, without human intervention or assistance. The learning is always based on some sort of observations or data, such as examples, direct experience, or instruction. So, in general, machine learning is about learning to do better in the future based on what was experienced in the past. Although a subarea of AI, machine learning also intersects broadly with other fields, especially statistics, but also mathematics, physics, theoretical computer science, and more.
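To make "learning from past observations" concrete, here is a small sketch using scikit-learn's bundled iris dataset and a decision tree; the library and model choice are just one common option:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Past observations: flower measurements with known species labels.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Learn a model from the training examples...
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# ...and measure how well it does "in the future" on unseen data.
print(f"Accuracy on unseen examples: {model.score(X_test, y_test):.2f}")
```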
MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks that allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages, including C, C++, C#, Java, Fortran, and Python.
MongoDB is an open-source document-based database system. Its name derives from the word "humongous" because of the database's ability to scale up with ease and hold very large amounts of data. MongoDB stores documents in collections within databases.
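A minimal sketch of storing and querying documents with the pymongo driver, assuming a MongoDB server is running locally on the default port (the database, collection, and document below are invented):

```python
from pymongo import MongoClient

# Connect to a local MongoDB server (assumed to be running).
client = MongoClient("mongodb://localhost:27017/")
collection = client["demo_db"]["customers"]

# Documents are schemaless, JSON-like dictionaries.
collection.insert_one({"name": "Ana", "city": "Rio de Janeiro", "score": 87})

# Query by field value, filtering on document attributes.
for doc in collection.find({"city": "Rio de Janeiro"}):
    print(doc["name"], doc["score"])
```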
By Natural Language, we mean a language that is used for everyday communication by humans; languages like English, Hindi or Portuguese. We can understand Natural Language Processing — or NLP for short — in a wide sense to cover any kind of computer manipulation of natural language. Providing more natural human-machine interfaces, and more sophisticated access to stored information, technologies based on NLP are becoming increasingly widespread (e.g. phones and handheld computers support predictive text and handwriting recognition; machine translation allows us to retrieve texts written in Chinese and read them in Spanish; text analysis enables us to detect sentiment in tweets and blogs).
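As a tiny example of computer manipulation of natural language, the snippet below tokenizes a sentence and counts word frequencies with NLTK's rule-based Treebank tokenizer (chosen here because it needs no model downloads); the sentence is invented for the example:

```python
from collections import Counter

from nltk.tokenize import TreebankWordTokenizer

sentence = (
    "Technologies based on NLP are becoming increasingly widespread, "
    "and NLP techniques power predictive text and machine translation."
)

# Split the raw character string into word tokens (rule-based, no downloads).
tokens = TreebankWordTokenizer().tokenize(sentence.lower())

# A first, very simple "analysis": word frequencies.
print(Counter(tokens).most_common(5))
```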
NoSQL (Not Only SQL) describes an approach to database design that implements a key-value store, document store, column store, or graph format for data. NoSQL databases especially target large sets of distributed data. Nowadays there are more than 225 NoSQL databases, most of them addressing some of these points: being non-relational, distributed, open-source, and horizontally scalable.
OLAP (Online Analytical Processing) is the technology behind many Business Intelligence (BI) applications. OLAP is a powerful technology for data discovery, including capabilities for limitless report viewing, complex analytical calculations, trend analysis, sophisticated data modeling and predictive “what if” scenario (budget, forecast) planning.
Power BI is a suite of business analytics tools to analyze data and share insights.
Predictive Analytics is the use of statistics and modeling to determine future performance based on current and historical data. Predictive analytics looks at patterns in data to determine whether those patterns are likely to emerge again, which allows businesses and investors to adjust where they use their resources in order to take advantage of possible future events.
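A minimal sketch of the idea, fitting a linear trend to an invented monthly sales series and projecting it forward (scikit-learn is just one convenient choice):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Historical data: month index vs. sales (figures invented for the example).
months = np.arange(1, 13).reshape(-1, 1)
sales = np.array([110, 115, 123, 130, 128, 140, 145, 150, 158, 160, 171, 175])

# Fit a trend on the past...
model = LinearRegression().fit(months, sales)

# ...and predict the next quarter's performance.
future = np.array([[13], [14], [15]])
print("Forecast for months 13-15:", model.predict(future).round(1))
```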
Python is a high-level interpreted programming language. It provides constructs intended to enable writing clear programs on both small and large scales. It features a dynamic type system and automatic memory management, and supports multiple programming paradigms, including object-oriented, imperative, functional, and procedural styles. Interpreters are available for many operating systems, allowing Python code to run on a wide variety of systems.
R is a language and environment for statistical computing and graphics. It is a GNU project and provides a wide variety of statistical (linear and nonlinear modelling, classical statistical tests, time-series analysis, classification, clustering) and graphical techniques, and is highly extensible.
SPSS Statistics is a software package used for logical batched and non-batched statistical analysis. The name originally stood for Statistical Package for the Social Sciences, reflecting the original market, although the software is now popular in other fields, used by market researchers, health researchers, survey companies, government, education researchers, marketing organizations, data miners, and others. Acquired by IBM in 2009, current versions are officially named IBM SPSS Statistics.
What-If Analysis is the process of changing the values in cells to see how those changes will affect the outcome of formulas on the worksheet. Three kinds of What-If Analysis tools come with MS Excel: Scenarios, Goal Seek, and Data Tables. Scenarios and Data tables take sets of input values and determine possible results. A Data Table works with only one or two variables, but it can accept many different values for those variables. A Scenario can have multiple variables, but it can only accommodate up to 32 values. Goal Seek works differently from Scenarios and Data Tables in that it takes a result and determines possible input values that produce that result.
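Outside Excel, the same Goal Seek idea can be written in a few lines of code. The sketch below uses simple bisection to find the input that makes a formula hit a target result; the formula and the target are invented for the example:

```python
def goal_seek(f, target, lo, hi, tol=1e-9):
    """Find x in [lo, hi] such that f(x) == target, assuming f is
    monotonic on the interval (the same idea as Excel's Goal Seek)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # Keep the half-interval where the target is crossed.
        if (f(mid) - target) * (f(lo) - target) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Example: what unit price yields revenue of 5,000 at 320 units sold?
price = goal_seek(lambda p: p * 320, target=5000, lo=0, hi=100)
print(round(price, 3))  # 15.625, i.e. 5000 / 320
```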
XML for Analysis (XMLA) is a SOAP-based XML protocol, designed specifically for universal data access to any standard multidimensional data source that can be accessed over an HTTP connection.
Now it's up to you! Bookmark this page and come back whenever you need it!
Software metrics are essential factors in the success of a software development project. Component-based software reuse has as its basic premises cost reduction and increased productivity and performance.
Digital transformation is about more than just convenience. It is also about improving efficiency, reducing costs, and mitigating risk. By adopting new technologies, financial institutions can streamline their operations, improve decision-making, and protect themselves against fraud.
As the digital landscape continues to expand its horizons, the intersection of technology and art becomes more captivating and inclusive than ever. At the heart of this evolution lies p5.js, a JavaScript library that empowers creative minds of all backgrounds to embark on a journey of coding, design, and artistic expression.