Friday 31 January 2020

Complete your AI puzzle with inference

IBM Prep, IBM Guides, IBM Learning, IBM Tutorial and Materials, IBM Certifications

Artificial intelligence is complex, and there are multiple ways to approach an AI initiative. I like to think of AI as a jigsaw puzzle. There are multiple pieces (AI phases) that depend on each other to realize the picture, and with every piece you get closer to the end goal (business insights). Jigsaw puzzles are rarely as simple as they seem, and seeing the overwhelming number of pieces thrown together in the box can be daunting. AI is similar. Whether it’s AI or puzzles, it’s easier if you have a map to guide you.

Data, Train, Inference (DTI) workflow: the three pieces of the AI puzzle


AI is exciting. Many organizations are embarking on AI journeys for the first time, and it isn’t always simple. Last year, the Data-Train-Inference (DTI) model was introduced to help guide clients. Before diving into specifics, it is important to know that the DTI model is not a linear workflow. It is a continuous loop consisting of three stages that interact at all times. And because the process is ongoing and continually being refined, the insights extracted are rich and valuable.

So, what are the pieces of the AI puzzle? As its name suggests, the DTI model has three main puzzle pieces: Data, Train, and Inference.

◉ Data – the build-a-solid-foundation piece. Not starting with a solid data foundation will send you down the wrong path before you’ve even started.

◉ Training – the piece where the magic of artificial intelligence occurs; where data becomes AI models.

◉ Inference – the piece that is really the sum of all the parts. Without proper inference, all prior efforts are for naught.

To complete your AI puzzle and reach success, it is necessary to have the right pieces of infrastructure to support each stage in the AI workflow. So how does the DTI model map to the IBM Power Systems hardware portfolio? For training, there’s IBM Power System AC922 packed with up to six NVIDIA V100 Tensor Core GPUs, engineered to be the most powerful training platform. But what about inference and data?

Meet the IBM Power System IC922 – a new inference server for your AI lineup


Today, I am pleased to share with you the new IBM Power System IC922, a purpose-built inference server designed to put your AI models to work and help you unlock business insights. Power IC922 supports up to six NVIDIA T4 Tensor Core GPU accelerators today, and IBM intends to support up to eight T4 GPU accelerators as well as additional accelerator options. This gives clients the flexibility to leverage the inference accelerator that best suits their needs. Power IC922 uses optimized hardware and AI software to deliver the necessary components for AI inference, whether in a central data center or in a distributed data center closer to the sources of data. The Power IC922 is modular and can scale with business needs.

Storage-rich with 24 SAS/SATA storage bays (future NVMe support intended), Power IC922 also fits neatly into the Data phase to help clients build a solid data foundation. Additionally, the advanced I/O architecture and data throughput of Power IC922 allow for the rapid responses that inference requests demand. Power IC922 is configurable, so whether you are looking for a data server or an inference server, it offers a needed piece of your AI puzzle.

To showcase how the IC922 fits into the AI puzzle, the Department of Defense High Performance Computing Modernization Program (HPCMP) recently demonstrated how the IC922 and AC922 could be combined into a modular computing platform, creating an IBM POWER9-based supercomputer in a shipping container. This modular computing capability, initially installed at the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory DoD Supercomputing Resource Center, will enable the DoD to redefine the term “edge” to include deployment of an AI supercomputing capability anywhere in the world, including the battlefield.

Source: ibm.com

Thursday 30 January 2020

Digital alchemy – Turning data into gold

IBM Tutorials and Materials, IBM Guides, IBM Learning, IBM Prep, IBM Certification

Data is the means by which enterprises can truly understand their customers – their needs, behaviors and buying patterns. And within an enterprise’s data lies the potential for greater revenue, increased efficiency and reduced costs.

The race to unlock that potential is the most significant play in business today.

As technology continues to evolve – and data moves to the cloud via the ever-expanding choice of operating systems, databases, platforms and computing models – ensuring that this information is not only safe but usable presents a significant challenge. At the same time, the volume of data is growing exponentially. Governance and sovereignty issues weigh heavily. And IT budgets are either flat or declining.

At Predatar (formerly Silverstring), our primary goal is to protect data. But we believe customers need more than simple insurance against loss. Refining managed data into a reusable asset represents the next great business opportunity. Through a unique combination of automation and smart people with the right skills and capabilities, our aim is to ensure our customers’ data defense is robust and ready to be turned into data offense.

Data has been called the “new oil,” but we believe that by refining and repurposing it, data has the potential to be even more valuable.

The true value of data


To help create this “digital gold,” we developed our Alchemis Protect solution, which delivers to our customers the confidence of knowing their data is more secure and available, no matter where it resides. Based on IBM® Cloud™ and utilizing IBM Spectrum® Protect, our as-a-service offering gives customers the opportunity to transform their backup data into a usable business asset.

Our solution is rooted in IBM technology for a simple reason – with IBM, partners like Predatar gain access to a rich and varied portfolio from which we can pick and choose the offerings that form the foundation of our solution. The cross-fertilization of technology and expertise that IBM brings gives us the confidence that every element will play together effectively.

But the technology is just the beginning. Alchemis Protect combines security intelligence with machine learning and the expertise of our team of skilled data specialists. Together, we are able to analyze historical trends in data integrity to identify potential trouble areas where customers may experience incidents of non-recovery. Then, we help them to plug those weaknesses. We call it intelligent recovery.

With data being the key to an enterprise’s fortune, you cannot afford to let a single bit slip through your fingers.

Source: ibm.com

Wednesday 29 January 2020

Tech Data’s new Tech Labs put the power of AI in the hands of partners

IBM Tutorials and Materials, IBM Prep, IBM Learning, IBM Certifications, IBM AI


The use of technology to support quicker and more informed decision-making is fundamentally changing the way we do business. Our customers expect faster, better and more innovative services. Our shipping and distribution channels need to be streamlined, eliminating waste and delivering everything just-in-time. Our boardrooms demand richer insight into performance metrics—and they want them now.

This acceleration of business—and knowledge—is why augmented intelligence (AI) has become one of the hottest topics in industry today.

However, here at Tech Data Italia, we have seen a relatively slow uptake of AI in the channel market. So, to help our partners understand the potential of this game-changing technology, earlier this year, we opened a permanent center of expertise for AI in Milan. We call it the Tech Lab and with it, we aim to help partners and their clients take full advantage of AI, in partnership with IBM and Red Hat.

Experimenting with innovation


The Tech Lab aspires to be the platform on which our partners can develop new skills by getting hands-on access—either in person or remotely—to a series of solutions and platforms that can be used to build innovative AI offerings. We work closely with these partners to develop new use cases and turnkey solutions for integration. And we also try to create an environment in which they can develop innovative client value propositions.

The lab space is open to Tech Data resellers who, in turn, can invite partners and clients for individual or group exploration and technical and business development. And for those that can’t make it to one of our lab locations, we make the technology available for remote use at our reseller or client sites. Throughout this process, our experts remain available to provide dedicated technical, sales and marketing support.

Our work at these labs is about helping our resellers and their clients accelerate their AI journey. And these efforts are based on the most advanced technologies from Red Hat and IBM, including IBM® Watson® and IBM Power® Systems.

Collaboration in practice


Tech Data is one of the first distributors in Italy to offer this kind of sophisticated AI facility. It’s a prime example of the power of collaboration between IBM and Tech Data in Italy and around the world, and it is a significant step forward for us in transforming the channel AI opportunity.

Since we launched the Tech Lab in February 2019, a large number of our partners have made great use of the facility to demonstrate the value that collaboration in AI can unlock for their partners.

◉ Plurimedia joined the IBM Partner Ecosystem relatively recently. The business specializes in developing hybrid cloud, IoT and AI solutions. And Plurimedia cites the principles of open source as being key to the transformative power of cloud technologies. Further, the company is passionate not only about cloud being a technology disruptor but also about the impact it can have on every aspect of business.

◉ Marketing MultiMedia Group (MMM Group) is a leading provider of digital advertising based on machine learning. We are partnering with MMM Group, IBM and the Politecnico di Milano to create innovative marketing campaigns that combine technology with commercial and human expertise, delivering a new generation of advanced marketing possibilities.

◉ Relatech has been an IBM Business Partner since 2014. A specialist in AI development and integration, the company believes that the real value in our partnership is based on the powerful combination of IBM technology, Tech Data’s commercial expertise and its own deep technical expertise.

◉ Aptus.ai has demonstrated an impressive competence in various AI fields, including a solution that aids cancer detection using image recognition. It is currently working with IBM and Tech Data in order to enhance its existing business offerings.

For Tech Data and all our partners, it is the power of partnership that enables us to go beyond the limits of our own individual capabilities. Working as part of this next-generation ecosystem enables each player to capitalize on the expertise, market insights and technical capabilities from across our market.

Source: ibm.com

Tuesday 28 January 2020

The road to cloud just got a lot less rocky

IBM Study Material, IBM Guides, IBM Learning, IBM Certification, IBM Certifications

For a technology that’s supposed to make everything simpler, cloud can get a little complicated.

Private. Public. Hybrid. Multicloud. XaaS. Kubernetes. There’s a lot to unpack and explain any time the conversation turns to cloud. And since the companies in the TIMETOACT Group are specialists in various kinds of cloud implementation, we have those types of conversations all of the time.

Luckily, this past summer with IBM’s acquisition of Red Hat, those discussions became a lot easier to have with our customers.

Shifting the conversation


Since we began working with IBM® Cloud™ Private in 2017, we’ve been having conversations with many of our clients about the respective capabilities of private and multicloud environments and about why moving to open technology offers a significant advantage in avoiding vendor lock-in.

So when the Red Hat acquisition went through and the IBM Cloud Pak™ line was formally announced, we were already in a strong position to take the technology to market. We quickly moved the conversations we were having with our clients to OpenShift®, initially as a platform in its own right — with its market-leading hybrid multicloud and containerization capabilities — and then highlighting the added possibilities presented by IBM Cloud Paks.

From identity management through to blockchain, the whole range is very much on our radar. Put simply, if we’re not working with a particular IBM Cloud Pak now, we can see very strong reasons to do so in the future.

Opening up a new approach


What’s different about Cloud Paks is the ease with which clients can understand the potential benefits to their business.

Talking about cloud invariably generates interest. But talking about specific solutions — like security or identity management — rather than about technology gives clients that “lightbulb moment”. They can more easily comprehend what these solutions might bring to their business.

A recent example was a mid-market customer — very much in the “white space” in IBM terms — that we worked with in Germany. The business wanted to automate its ordering process. Using IBM Cloud Pak for Automation, we were able to bundle together the products needed to create that solution under a single license, rather than negotiating individual licenses for the individual products.

That customer ultimately signed on to a five-year commitment to our solution, and that was in no small part thanks to the simplicity of sourcing the technology they needed to run a compelling solution.

As that engagement demonstrates, bundling products into solutions in the form of Cloud Paks makes technology easy to understand — and more crucially, easy to consume. Of course, one size can never fit all, so some of the bundled products and the way they are measured may be a more difficult fit. But the concept and the way it is executed “for the majority” resonates strongly with our clients.

Ultimately, it’s a simple concept. And at a time when clients are looking for help in accelerating their journey to cloud, that simplicity is a major competitive advantage. We’ve already seen it create a recognizable brand for IBM Cloud Paks.

Now when we talk to clients about cloud, there’s a good chance they already know about Cloud Paks. That makes selling IBM to our customers even easier.

Source: ibm.com

Monday 27 January 2020

Key IBM AIX security features you can’t afford to miss

IBM Study Materials, IBM Guides, IBM Learning, IBM Tutorial and Material, IBM Prep

Reducing cybersecurity risk is a central concern for businesses today. As hackers become more sophisticated, and as more business is done on mobile devices, risks have increased, and that means organizations may benefit from a “defense in depth,” or multilayered, approach to their security strategy. IBM Power Systems and the POWER9 processor facilitate the “defense in depth” security approach by providing key security features for hardware, operating systems, firmware, hypervisor and security tool suites like IBM PowerSC.

IBM Power Systems’ security capabilities also support a wide range of operating systems, and in this post we’ll focus on key IBM AIX security features you should consider adopting as you move to IBM POWER9.

1. AIX Secure Boot and AIX Trusted Execution


The Center for Internet Security (CIS) provides universal cybersecurity recommendations that are applicable to all types of organizations. CIS ranks “Inventory and Control of Software Assets” as the second prioritized control out of 20, where each control is essentially a category of cyber defense. The IBM AIX Secure Boot and Trusted Execution features fall into this category. Using the CIS 7.1 standard as a basis, I believe these tools are two of the most important cybersecurity defenses for your AIX enterprise environment.

PowerVM’s Secure Boot feature uses digital signatures to verify the integrity of PowerVM firmware, including HostBoot, Power hypervisor (PHYP) and partition firmware (PFW). AIX Secure Boot extends the firmware verification done by the PowerVM Secure Boot feature by cryptographically verifying the authenticity of the OS bootloader, the kernel and the runtime environment, including device drivers, kernel extensions, applications and shared libraries.

After AIX Secure Boot has verified the integrity of the boot process, you can then use AIX Trusted Execution to safeguard the integrity of the AIX runtime execution environment by cryptographically verifying the integrity of scripts, executables, kernel extensions and libraries that are loaded by the AIX kernel after the system has completed the secure boot process. When correctly utilized, AIX Secure Boot and AIX Trusted Execution are designed to provide a powerful measure for preventing or detecting malicious code executing on your POWER9 AIX systems.
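
Conceptually, this kind of runtime verification works like the sketch below: a digest of each trusted file is recorded while the system is in a known-good state, and anything unknown or altered is refused before it runs. This is only an illustration of the idea. AIX Trusted Execution actually relies on cryptographic signatures in its Trusted Signature Database and is configured with AIX tooling, not user code; the file paths here are arbitrary examples.

```python
import hashlib

def file_digest(path: str) -> str:
    """SHA-256 digest of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_trusted_db(paths):
    """Record digests while the system is in a known-good state
    (the role the Trusted Signature Database plays on AIX)."""
    return {p: file_digest(p) for p in paths}

def verify_before_load(path: str, trusted_db: dict) -> bool:
    """Refuse to run anything unknown or altered."""
    recorded = trusted_db.get(path)
    if recorded is None:
        raise PermissionError(f"{path} is not in the trusted database")
    if file_digest(path) != recorded:
        raise PermissionError(f"{path} has changed since it was trusted")
    return True

# Trust a binary, then verify it before it is allowed to execute.
db = build_trusted_db(["/bin/ls"])
print(verify_before_load("/bin/ls", db))
```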

Why are these two features so important? In numerous security breaches, attackers commonly use malware. In some breaches, attackers have used multiple types of malware to facilitate their successful breach. Attackers can also use hacking tools to enable them to further penetrate a victim’s environment. Additionally, these two features are part of the prioritized cybersecurity controls recommended by the Center for Internet Security’s CIS 7.1 standard mentioned above. This second control states: “Actively manage (inventory, track, and correct) all software on the network so that only authorized software is installed and can execute, and that all unauthorized and unmanaged software is found and prevented from installation or execution.”

The following are prerequisites for AIX Secure Boot:

◉ Hardware: POWER9 systems (Power E950 and above)
◉ Firmware: 920
◉ HMC: Release 9 Version 920
◉ AIX: AIX 7.2 TL3 SP1 (72M)

NOTE: The IBM PowerSC Graphical User Interface provides centralized management functionality to simplify management of Trusted Execution across multiple AIX partitions.

2. New cybersecurity compliance profiles available with PowerSC Graphical User Interface


IBM PowerSC is an integrated technology designed to assist Power Systems clients with general cybersecurity and cybersecurity compliance in cloud and virtual environments. It can help you save time and reduce risk by increasing visibility across your IBM Power Systems stack. PowerSC 1.3.0.0, released on December 13, 2019, provides two new security hardening profiles, and the PowerSC Graphical User Interface provides the ability to apply a set of recommended settings to multiple systems.

One of the new security hardening profiles is based on the CIS Security Benchmark settings for AIX 7.1. This new CIS profile provides universal security hardening settings that can be utilized by all AIX enterprise environments using AIX 6.1, 7.1 or 7.2.

The other new security hardening profile is for Department of Defense (DoD) organizations. This is the new DISA STIG profile.

3. Fileset changes


One of the goals in reducing the attack surface of any operating system is to avoid installing software that isn’t needed. Eliminating unnecessary software not only gives a hacker fewer elements to exploit but also reduces the superset of software that must be managed for security patches.

To provide you with more control over the software that’s installed on your system, the bos.net.tcp.client and bos.net.tcp.server filesets in IBM AIX have been split into 33 new filesets. This new fileset design allows you to build more granular images that include only the filesets needed by your system.

4. In-core cryptographic functionality


The OpenSSL version 1.0.2.1100 fileset and AIX 7 with 7200-03 can use the in-core cryptographic functionality that’s available starting with POWER8 systems. This support is engineered for better performance when cryptographic operations involve the following ciphers:

◉ AES-128-CBC
◉ AES-192-CBC
◉ AES-256-CBC
◉ AES-128-ECB
◉ AES-192-ECB
◉ AES-256-ECB
◉ AES-128-GCM
◉ AES-192-GCM
◉ AES-256-GCM
◉ AES-128-XTS
◉ AES-192-XTS
◉ AES-256-XTS
◉ SHA1
◉ SHA224
◉ SHA256
◉ SHA384
◉ SHA512

Although this feature is more directly related to performance, it’s also relevant to cybersecurity: we have seen the adoption of more computationally intensive cryptographic ciphers hindered by the performance hit they carry. Removing any such hindrance can improve security in certain instances.
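
To make concrete what applications ask of these ciphers, here is a short example of AES-256-GCM, one of the accelerated algorithms listed above. It uses the third-party Python cryptography package purely as an illustration; on POWER systems the acceleration happens inside OpenSSL, transparently to application code like this.

```python
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# AES-256-GCM: authenticated encryption with a 256-bit key.
key = AESGCM.generate_key(bit_length=256)   # 32-byte key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # 96-bit nonce, the standard GCM size

plaintext = b"sensitive business record"
aad = b"record-id:42"                       # authenticated but unencrypted metadata

ciphertext = aesgcm.encrypt(nonce, plaintext, aad)           # encrypt + authenticate
assert aesgcm.decrypt(nonce, ciphertext, aad) == plaintext   # verify + decrypt
```

The heavier the cipher (GCM adds authentication on top of encryption), the more an in-core implementation pays off, which is exactly the trade-off described above.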

Defend yourself against cyberattack


Cybercriminals are making significant strides in improving their ability to attack organizations. This cyber war is a constantly moving target, as hackers never stop creating new methods of attack. A defense in depth cybersecurity approach is fundamental to reducing your security risk. The features mentioned in this post are four positive steps toward a robust defense in depth security implementation that may make the difference in preventing or reducing the effects of a data breach for your organization.

Source: ibm.com

Sunday 26 January 2020

The Top 10 storage moments of 2019

IBM Study Materials, IBM Guides, IBM Learning, IBM Tutorial and Material, IBM Certification, IBM Online Exam

2019 was a big year for IBM Storage, with a slew of exciting launches of new solutions, fascinating and valuable reports, and deep dives into the ways in which storage can help your organization continue to innovate and drive value from your oceans of data.

But amongst all that great news, what stands out as the best storage moments of 2019? Read on to find out the top ten, presented in no particular order.

1. Storwize V5000 Launch


In April, IBM launched the new Storwize V5000 family of offerings to complement Storwize V7000 Gen3. The Storwize V5000 models support end-to-end NVMe and include industry-acclaimed IBM Spectrum Virtualize software to provide increased performance and enterprise-class functionality, availability and reliability in easy-to-buy, easy-to-use and easy-to-manage entry storage systems. These models demonstrate the IBM Storage commitment to simplifying your modern IT and hybrid multicloud infrastructure.


2. DS8900F Launch


Designed for mission-critical production systems and with strong synergy with IBM Z and the cloud, the new family of all-flash arrays launched in September. The DS8900F provides enhancements in performance, modern data protection, cyber resiliency, availability and cost-efficiency – all of which are demanded in mission-critical hybrid multicloud mainframe environments – with simple and secure support for data to the cloud.

3. Spectrum Discover Launch


The launch of IBM Spectrum Discover in July provided tools to help enterprises address the challenges of unstructured data across multiple vendors (IBM, Dell/EMC Isilon and NetApp), the cloud and even backups. IBM Spectrum Discover provides unified metadata management and insights for file and object storage that can be leveraged for storage optimization, data analytics and AI. With extended support and open APIs, IBM Spectrum Discover delivers insight into an expansive universe of file and object data.

4. IBM Storage for Red Hat OpenShift Container Platform Launch


The IBM Storage for Red Hat OpenShift Container Platform, which launched in August, is a reference solution bringing together IBM Solutions and open source technologies. IBM Storage offers new advantages and benefits for enterprises moving into hybrid multicloud operations, including better security, service orchestration, infrastructure agility, performance, and availability. The IBM Storage portfolio supports IBM Cloud Paks as well as standalone applications (including DevOps, database, HPC, analytics and AI workloads).

5. TechQuickie Autonomous Driving Vehicle Collaboration


This exciting video from April explains how self-driving cars work, revealing that an individual vehicle can generate up to 15 terabytes of data every hour. Fortunately, IBM Spectrum Storage, optimized for AI and machine learning with industry-leading GPU-accelerated servers, helps automakers manage all of this data to keep those self-driving cars safely on the road.


6. Elastic Storage System 3000 Launch

The launch of the ESS 3000 in October addressed a critical area of emphasis for AI, big data and analytics workloads – the management and storage of unstructured data.

IBM provides market-leading solutions for the requirements of AI-driven organizations, addressing every stage of the pipeline from ingest to insight. Leveraging IBM’s industry-acclaimed IBM Spectrum Scale, the ESS 3000 is the latest in IBM’s line of high-performance, highly flexible scale-out file system solutions engineered to handle the toughest unstructured data challenges with the ultra-low latency and massive throughput advantages offered by Non-Volatile Memory Express (NVMe) flash storage.

7. Gartner 2019 Magic Quadrant Leader in 4 Reports


This year, IBM was honored to be recognized as a Leader in four 2019 Gartner Magic Quadrant reports: Distributed File Systems and Object Storage; Critical Capabilities for Object Storage; Primary Storage; and Data Center Backup and Recovery Solutions. The Magic Quadrant for Data Center Backup and Recovery Solutions (October 2019) was the eighth report in a row in which IBM was recognized for both its completeness of vision and its ability to execute in the data protection market, driven by the proven ability to provide a highly scalable, efficient and modern data protection platform that safeguards clients’ data.

8. IDC’s 2019 Worldwide Datacenter Support Customer Satisfaction Study Rates IBM #1


In November, IBM took the top spot in IDC’s 2019 Worldwide Datacenter Support Customer Satisfaction Study, which looked at twelve leading vendors and found that support and services are a key differentiator that puts IBM at the head of the pack. IBM also received the best NPS rating of any storage vendor in the study, excelling in overall support services contract satisfaction as well as satisfaction with the ease of doing business with the company.

Check out the InfoBrief

9. IBM Storage eBooks


Throughout the year, IBM Storage developed resources for enterprises looking to expand their knowledge and capability in various important areas of storage. Subjects covered included AI & big data, cyber resiliency, modern data protection, and NVMe over Fibre Channel.


10. IT Brand Pulse Flash Leader for NVMe-oF


IT Brand Pulse selected IBM as Market Leader for the new category of “All Flash NVMe-oF Array,” covering all-flash arrays for block or file storage with NVMe over FC, NVMe over RoCE, or NVMe over TCP interfaces. In addition to being overall Market Leader, IBM was also named Performance Leader, Reliability Leader, Service & Support Leader, and Innovation Leader (in a tie).

Saturday 25 January 2020

Measure trust with blockchain technology

As an enterprise, you want your business network to trust you. Oftentimes, however, the cost of trust lies not in what we can trust but in what we cannot. These costs are rarely accounted for in business networks; normally, account reconciliation techniques are in place to act as a catch-all for untrusted parties. With immutable proof and consensus within a business network, you can significantly reduce or eliminate the need for that reconciliation.

IBM Tutorials and Materials, IBM Study Materials, IBM Guides, IBM Learning, IBM Blockchain

While closely examining over thirty retail enterprises, we found that the ability to trust your business network is a critical factor in supply chain success. Creating trust, however, isn’t straightforward; establishing it requires a measured approach.

The three measurable areas of trust


PwC’s 2018 Global Blockchain Survey found that 45 percent of companies investing in blockchain technology believe that lack of trust among users will be a significant obstacle to blockchain adoption, due to uncertainty with regulators and concerns about the ability to bring business networks together. However, by confronting this trust gap early, we’re able to plan cybersecurity and compliance frameworks that regulators and stakeholders will trust.

Generating trust within intracompany business networks is a critical component of customer success. Fortunately, looking at data from supply chain networks across the retail industry, we identified three key measurable areas often considered the foundation of enterprise trust.

1. Validity of data. An organization accurately sources information that holds true when shared with the consumer.

2. Governance of data. An organization has defined fair business rules to manage data and to align with business processes.

3. Reliability of data. An organization acts consistently and proactively, in a timely, thought-driven manner.

The adoption of blockchain technology is measured by how well it is understood and trusted. Generally, enterprises hesitate to embrace an emerging technology, especially one that requires a new method of sharing data. Blockchain technology has proven several concepts across multiple industries, delivering value in a manner that delegates data management back to the consumer and to regulatory bodies. As we embark on the next phase of scalability, we’re beginning to zoom in on the immutable proof this technology provides, with the ability to define measures and report results that impact not only an enterprise but an entire industry.

Over more than a decade of this transformative technology, through the rise of digital commerce, trusting business networks has raised significant questions for IT transformation. This transformation has redefined technical stacks and produced new data models. Technology alone, however, cannot scale trust within a transaction; to reduce the cost of trust, we must identify quantifiable metrics.

How do you quantify trust?


Today, blockchain presents an interesting predicament in corporate America, with businesses boxed in by traditional infrastructures and long-standing processes. With the rise of blockchain technology, businesses are demanding data quality reviews before progressing to solutions that apply automated business logic across a value chain.


Although trust is a qualitative attribute, the measure of trust depends on quantifiable metrics. Quantifiable measurements of trust include:

System behavior. Programs and applications that capture the movement of data, as exhibited by transactions, consensus or votes.

Content analysis. Using techniques like Natural Language Processing (NLP), deep learning and AI/ML to analyze structured and unstructured data.

IoT and web-based identities. IoT devices, QR codes, and web-based identities for cybersecurity.

By capturing quantifiable data, enterprises begin to establish governance with a collection of automated business processes, improving regulatory compliance and safeguarding the customer experience. The expansion of blockchain data depends on migrating enterprise legacy data into building blocks and defining rules for each block, which results in significant enhancements to business outcomes.
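
To illustrate how such metrics might roll up into a single number, the sketch below weights hypothetical scores for the three measurable areas described earlier (validity, governance, reliability). The metric names, weights and scores are assumptions made for the example, not a published scoring model.

```python
# Hypothetical composite trust score over the three measurable areas.
# Each input score is assumed to be normalized to the range [0.0, 1.0].
WEIGHTS = {"validity": 0.4, "governance": 0.3, "reliability": 0.3}

def trust_score(scores: dict) -> float:
    """Weighted average of the per-area trust metrics."""
    missing = set(WEIGHTS) - set(scores)
    if missing:
        raise ValueError(f"missing metrics: {sorted(missing)}")
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Example: strong data validity, weaker governance maturity.
print(trust_score({"validity": 0.92, "governance": 0.61, "reliability": 0.80}))
```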

How does trust interoperate?


Today, data silos exist within business networks that can lead to inaccurate information. Compatibility between data silos requires stronger interoperability between all parties within a network. In order for compatibility to exist, trusted networks must share information that complies with regulatory standards and industry-wide initiatives, in real-time.

Interoperability in business networks for blockchain consumption can be demonstrated by:

◉ Integrating databases to align based on identity

◉ Enabling a data engine, like Watson, to seek keywords

◉ Parsing notes with techniques, like NLP, to structure data

◉ Defining rules within measures to validate data

◉ Translating business rules into a smart contract to automate a business process

This generates an immutable record on the blockchain. As you begin to acquire blockchain records, you can track and report measures that are linked with industry-wide initiatives, resulting in the proactive management of data and the automation of business processes atop unstructured data and across supply chains. To take this a step further: because the logic within smart contracts automates the business process, we eliminate the requirement of trust within a business network; in other words, we increase trust with immutable records.
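
The immutability these records provide comes from hash chaining: each record embeds the hash of the one before it, so altering any historical entry invalidates every record after it. Below is a minimal Python sketch of that idea; production networks such as those built on Hyperledger Fabric add consensus, digital signatures and smart contracts on top.

```python
import hashlib
import json
import time

def record_hash(body: dict) -> str:
    """Deterministic SHA-256 over a record's canonical JSON form."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_record(chain: list, payload: dict) -> None:
    """Link a new record to the hash of the previous one."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev_hash": prev, "ts": time.time()}
    chain.append(dict(body, hash=record_hash(body)))

def verify_chain(chain: list) -> bool:
    """Tampering with any earlier record breaks every later link."""
    for i, rec in enumerate(chain):
        body = {k: rec[k] for k in ("payload", "prev_hash", "ts")}
        if rec["hash"] != record_hash(body):
            return False
        if i and rec["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_record(chain, {"event": "shipment received"})
append_record(chain, {"event": "customs cleared"})
assert verify_chain(chain)

chain[0]["payload"]["event"] = "forged"   # rewrite history...
assert not verify_chain(chain)            # ...and verification fails
```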

Blockchain automates trust


Blockchain has become a technology on which to build tools that automate trust. In a collaborative economy, that means trusting enterprises not based on reputation or brand but based on the immutability of blockchain. The cost of trust is high, and we incur risk each time we cannot completely trust our business network. If trust is measured, then the cost of trust becomes marginal when two parties trading goods must trust each other and the business network to govern a transaction.

Source: ibm.com

Thursday 23 January 2020

Where to find help on your pervasive encryption journey

IBM Study Material, IBM Online Exam, IBM Certifications, IBM Exam Prep, IBM Learning

The threat of data breaches remains very real — we hear about it often in the newspaper headlines and on TV. Security is a high-priority issue for businesses, and as technology continues to evolve, you need security solutions that can keep you one step ahead of the threats.

IBM z15 was recently released, and I’m expecting it to be a game changer in the security industry. With IBM z14, IBM introduced pervasive encryption, a consumable approach to enable extensive encryption of data in-flight and at-rest. With IBM z15 and with IBM Data Privacy Passports, IBM Z is designed to help clients get transparent, end-to-end data protection and data privacy — even when the data moves off of IBM Z. With these features, IBM Z offers a security-rich environment for your data.

Pervasive encryption: Are you ready?


Pervasive encryption has been available since 2017, taking advantage of the synergy between the IBM Z software stack and the incredible encryption capabilities of IBM z14. Two years after its release, some clients are still asking, “How do I know if I’m ready to implement pervasive encryption?” and “How does it work?” If you’re still asking those questions, the Pervasive Encryption Readiness Assessment (PERA) from IBM Systems Lab Services could be an enormous help. After a short data collection, we provide a one-day, onsite workshop (with all the teams involved in the project) that includes deep-dive education, an assessment of your readiness and a suggested implementation roadmap. Lab Services has delivered this offering to 20 clients in North America so far, and now they all know the next steps on their path to pervasive encryption.

Tools and features to support your pervasive encryption journey


The main component of pervasive encryption is z/OS data set encryption. With z/OS data set encryption, you can encrypt data sets automatically, with no application changes and minimal overhead on z14 or z15. There are some technical requirements: the data sets need to be in extended format to be eligible for encryption. If you don’t know how many of your data sets are in extended format, IBM z Systems Batch Network Analyzer (zBNA) can help. This no-cost tool helps you assess how many of your data sets are eligible for z/OS data set encryption. Even better, it gives you the expected CPU overhead once encryption is turned on for these data sets.

Did you know that if you enable IBM zEnterprise Data Compression (zEDC) (you need zEDC cards on z14 or the in-core compression of z15), you can probably offset a big part of the encryption “cost”? Again, zBNA can help you simulate this and make the right decision. And if you don’t want to run it yourself, Lab Services can help you.
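
The intuition behind that offset is simple: compressing data before encrypting it shrinks the number of bytes the cipher has to process (on IBM Z, zEDC or the z15 on-chip accelerator does the compression in hardware). A rough software-only illustration in Python, using the standard-library zlib module and the third-party cryptography package:

```python
import os
import zlib

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

data = b"repetitive business data " * 4000   # highly compressible sample payload

key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)

compressed = zlib.compress(data)             # hardware-assisted on Z via zEDC
ciphertext = AESGCM(key).encrypt(nonce, compressed, None)

# The cipher touches far fewer bytes than it would on the raw data,
# which is how compression offsets part of the encryption cost.
print(len(data), len(compressed), len(ciphertext))
```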

With z/OS data set encryption, when implementation time comes, you have to reallocate the data sets to enable encryption. This can take time if you have to implement a process to force the reallocation of data sets that aren’t “naturally” reallocated. And when it comes to files allocated to your online processes (like VSAM files in CICS file-owning regions), you want this reallocation to complete as quickly as possible rather than depending on a long manual process.

Have you heard about IBM z/OS Data Set Mobility Facility (zDMF)? With this product, you can encrypt hundreds of files without writing any manual processes, and your online files can be encrypted in seconds. This tool is so powerful that, after we recommended it to a client, one system programmer said he’s now able to encrypt all the files that require encryption in his company’s IT environment without assistance.

Help and advice from the professionals


If you feel ready for pervasive encryption but still don’t know how to move forward and need some help, IBM Systems Lab Services can assist you all the way from a proof of concept through your implementation. Lab Services can help clients with z/OS and Linux implementations. Of the North American clients where we delivered a Pervasive Encryption Readiness Assessment, 40 percent have engaged or are engaging IBM Systems Lab Services on the journey to pervasive encryption. And this assistance can include all the components you might need to secure this deployment, from basic “bricks” to key management solutions.

One piece of advice while you’re thinking about data security: Don’t forget to check your z/OS security baselines. There’s no point in encrypting all your data if you have weaknesses in your security baselines and any skilled hacker could sneak into your systems and steal the encrypted data anyway … or worse, encrypt it, get the key and then offer it back to you in exchange for lots of money. There’s no known story of ransomware on IBM Z so far, and we don’t want to hear one anytime in the future. Again, Lab Services can help you to assess your baselines. It’s also a good idea, if you haven’t yet done so, to implement real-time monitoring of your IBM Z environments. The synergy between products like zSecure (which can also make security health checks easier) and QRadar makes it possible with ease.

Data privacy today


Now, let’s talk about data privacy. Have you heard of the characters Alice and Bob, created by Ron Rivest, Adi Shamir and Leonard Adleman, the RSA encryption algorithm inventors, in 1978? These characters wanted to exchange private messages using encryption.

Today, the world has changed, and with the privacy challenges in social media and elsewhere, along with new regulations protecting citizens’ data, we need data privacy. The Alice of today wants to exchange private messages with Bob but she also wants to have full control of her data and its usage. IBM Data Privacy Passports is the IBM answer to these challenges. Clients can protect data and enforce permitted use of that data when it is shared off-platform with IBM Data Privacy Passports, a data-centric audit and protection (DCAP) solution built on IBM Z security. With it, users like Alice will be able to control who has access to their data at a granular level and have full control of the data lifecycle — even when the data leaves IBM Z to go to a cloud environment, for example.

IBM Data Privacy Passports is available as a “beta” to clients that want to participate in this new, exciting journey toward data protection and privacy on IBM Z. And again, IBM Systems Lab Services will be there to help you. Isn’t it an exciting future in front of us?

Wednesday 22 January 2020

Five trends to shape blockchain in 2020

IBM Study Materials, IBM Guides, IBM Learning, IBM Blockchain, IBM Tutorial and Material

The year 2019 was pivotal for enterprise blockchain. The technology expanded beyond adoption by innovators and first movers to include a growing number of organizations working together to rapidly turn blockchain’s promised value into tangible business results.

As active blockchain networks bring real transformative change to a number of industries, the IBM Blockchain team conducted interviews across our vast group of technology experts, researchers, and those partnering closely with clients across industries to pull together the following five key themes we expect to materialize over the coming year:

1. Pragmatic governance models will emerge


With greater blockchain adoption on the horizon, governance will become a key factor. Yet, creating a governance model that all participants agree upon can be challenging. In fact, we have found that 41 percent of organizations believe lack of uniform governance standards across partner organizations to be the most significant challenge to progressing their blockchain proof of concept (PoC) or minimal viable ecosystem (MVE).

In 2020, we’ll start to see new governance models that enable large and diverse consortia to approach decision-making, permissioning schemes, and even payments more efficiently. These models will help to standardize information from different sources and capture new and more robust data sets. In the next one to three years, we learned that 68 percent of CTOs and CIOs even expect to see a scalable governance model for interactions across multiple blockchain networks to be an important feature of their organization’s blockchain environment.

To get others to agree with the group — especially those key contributors that single-handedly make the network more valuable — there needs to be a willingness to cooperate and collaborate. Sometimes this is achieved by incenting participation. This year, members of an existing network may encourage strategic industry players to join using monetary incentives. For example, a global supply chain consortium might subsidize members of a government-regulated customs authority agency to join a network, based on the fact that their participation, as well as their data, will allow the network to be exponentially more impactful.

2. Interconnectivity comes one step closer to reality


We have found that success in blockchain relies on collaboration from multiple parties. But, with the potential for tens or hundreds or even thousands of participants on a network, it’s unreasonable to expect that each party within a network will use the same vendor or incorporate a new computing environment for just one application. Even so, there’s an exceptional need for businesses to seamlessly share data.

Though reaching full interconnectivity might be years away — and the definition of interoperability can take many forms — we find that 83 percent of organizations today believe assurance of governance and standards that allow interconnectivity and interoperability among permissioned and permissionless blockchain networks is an important factor in joining an industry-wide blockchain network, with more than one-fifth believing it to be essential. Although there’s still work to be done on this front, this year, as more emerging networks attain critical mass, more members of a single network will expect (if not demand) guidance on integration between different protocols.

3. Adjacent technologies will combine with blockchain to create a next level advantage


Now that blockchain solutions are capturing millions of data points and making their presence felt in the world, they’re opening the door to new capabilities. Adjacent technologies like IoT, 5G, AI and edge computing — to name a few — will combine with blockchain to drive enhanced value for network participants. For example, blockchain solutions that pair with the Internet of Things and AI, compared to other emerging technologies, are expected to be the top accelerators of blockchain-enabled marketplaces in the future.

Combining adjacent technologies with blockchain will help us to do things that haven’t been done before. More trustworthy data from the blockchain will better inform and strengthen underlying algorithms. Blockchain will help keep that data secure and audit each and every step in the decision-making process, enabling sharper insights driven by data that network participants trust.

4. Validation tools will begin to combat fraudulent data sources


With 88 percent of institutions, according to our research, believing that the assurance of standards to communicate data to and from blockchain networks is an important factor in joining an industry-wide blockchain network, there’s no question that trust and transparency are essential. But in a world where data is being collected and transferred faster than ever, it’s understood that there will be inconsistencies in our data, whether from human error or malicious players.

With a need for heightened data protection mechanisms, this year, blockchain solutions will use validation tools along with crypto-anchors, IoT beacons and oracles, mechanisms that link digital assets to the physical world by injecting outside data into networks. This will improve trust and remove the dependency on human data entry, which is often prone to error and fraud.
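
A minimal sketch of the validation idea: the oracle authenticates the data it injects, so the network can reject anything tampered with in transit or submitted by an unauthorized source. Real crypto-anchor and oracle designs use device-bound keys and asymmetric signatures; this example uses a shared-secret HMAC, and the key and field names are illustrative assumptions.

```python
import hashlib
import hmac
import json

ORACLE_KEY = b"shared-secret-key"  # hypothetical key provisioned to a trusted oracle

def oracle_report(reading: dict) -> dict:
    """A trusted data source signs what it injects into the network."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(ORACLE_KEY, payload, hashlib.sha256).hexdigest()
    return {"reading": reading, "tag": tag}

def validate_report(report: dict) -> bool:
    """The network accepts outside data only if the tag checks out."""
    payload = json.dumps(report["reading"], sort_keys=True).encode()
    expected = hmac.new(ORACLE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, report["tag"])

report = oracle_report({"sensor": "truck-17-temp", "celsius": 4.2})
assert validate_report(report)

report["reading"]["celsius"] = 9.9    # tampered in transit
assert not validate_report(report)    # rejected before it reaches the ledger
```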

5. Central banks will expand into wholesale and retail Central Bank Digital Currencies


Tokens, digital currencies and central bank-backed digital currencies (CBDCs) have been a growing topic of interest for capital markets. Tokenizing assets and securities, converting them to digital tokens, and then trading, exchanging and settling custody of such digital assets is transforming the efficiency, security and productivity of capital markets. In fact, 58 percent of organizations we surveyed agree that they can derive new sources of revenue by tokenizing assets exchanged on a blockchain-enabled marketplace. In addition, new organizations and regulations have even been put in place to facilitate the creation, handling, trading and settlement of such tokens and digital currencies.

What changes can we expect to see in this field for 2020? With countries in Asia, the Middle East and the Caribbean beginning to experiment with CBDCs in real time, there is no doubt that they will continue to gain momentum in the new year and redefine payments in several ways. For one, wholesale CBDCs will see continued expansion, with some initial forays into retail CBDCs. Moreover, we expect increased interest in the tokenization and digitization of other types of assets and securities, such as central bond debentures for treasury bonds.

Closing thoughts


While spending time anticipating the future of this innovative technology is extremely exciting, we recognize new dynamics are continually entering the market that may challenge these trends as we see them today. There are also plenty of promising trends, like the rise of digital identity for blockchain, that we haven’t touched on here. However, one thing is certain: blockchain will continue to disrupt, enhance and improve the world we live in.

Tuesday 21 January 2020

Four ways to stem the tide of rising cybersecurity risks

Cybersecurity Risks, Power Systems, IBM Systems Lab Services
One of the greatest challenges in the IT industry is staying ahead of the cybercriminal. This is no easy task. The 2019 Cost of a Data Breach Report, conducted by the Ponemon Institute and sponsored by IBM Security, indicates that the chances of experiencing a data breach have increased from 22.6 percent in 2014 to 29.6 percent in 2019. In other words, organizations are now one-third more likely to experience a breach in the next two years. The increased success that cybercriminals are achieving underscores the importance for IT organizations of ensuring they’re providing the proper measures for reducing cybersecurity risk.

The following are my recommendations for organizations seeking to significantly reduce cybersecurity risk in their business:

1. Use a “defense in depth” approach


Many organizations deploy only a portion of the cybersecurity countermeasures that should be utilized. This can result in weak links in the chain of cybersecurity defenses. Even if most of an organization’s cybersecurity chain is strong, a cybercriminal can exploit the weak links, potentially causing a data breach that wouldn’t have been possible if a defense in depth approach was used.

A defense in depth approach consists of having many different layers of cybersecurity defense. If a layer is defeated by a hacker, there are still other security layers in place to thwart the attacker. An excellent example of such an approach to cybersecurity is found in the Center for Internet Security (CIS) Controls version 7.1.

2. If you’re going to deploy security defenses, do it right


Some cybersecurity defenses aren’t easy to implement, and some can be implemented in numerous different ways. The quality of your implementation could be the difference in whether or not you prevent a data breach. Some of the biggest data breaches in the last decade were due not to organizations failing to deploy the appropriate defenses but failing to deploy defenses properly.

Take as an example the reduction of unnecessary access. Reducing unnecessary access first requires understanding the subset of full access that’s needed for users to perform their jobs. Access is something that can vary from organization to organization depending on user requirements. Thus, you need to do your research in order to properly manage access. Depending upon the complexity of an organization, this is something that could take weeks, if not months, to implement right.

3. Get your security and system administration teams working together


An organization can be exposed to greater security risk if its security plan was created with a lack of synergy between security and systems administration teams. Achieving robust system security requires both teams to share knowledge and work together to define security policies specific to their IT environment.

The system administration team can offer substantial help to the security team since it has a thorough understanding of the operating systems and application groups in the organization. Once the security team has done its research, it should define a security plan that details the organization’s security policy requirements, and the system administration team’s job is to abide by it.

4. Take advantage of firmware and hypervisor security features


Since a security system is only as strong as its weakest link, make sure your defense in depth strategy includes security defenses for the firmware and the hypervisor.

Here I’ll get more brand-specific since IBM Power Systems is the server group I know best. IBM POWER9 servers come with firmware and hypervisor security features designed to bolster an organization’s security efforts. We’ll talk about specific operating system security features in upcoming blog posts, but there are important developments that fall under firmware and hypervisor security that I suggest you consider.

For example, IBM PowerVM Secure Boot, which I consider an important security defense feature, allows only appropriately signed firmware components to run on the system processors. Using digital signatures generated by IBM, Secure Boot verifies the authenticity of the following components of your firmware stack:

◉ Hostboot

◉ Power Hypervisor (PHYP)

◉ Partition firmware (PFW)

An included framework provides remote firmware attestation using a hardware Trusted Platform Module (TPM). The attestation supports Trusted Computing Group (TCG) 2.0 compliant trusted boot.
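
Trusted boot measurements of this kind accumulate in the TPM’s Platform Configuration Registers through the standard “extend” operation: each new measurement is hashed together with the register’s current value, so the final value commits to the entire boot sequence in order. The Python sketch below shows just that arithmetic; the component names follow the firmware stack listed above, and the digests are stand-ins rather than real firmware measurements.

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    """TCG-style extend: new_pcr = SHA-256(old_pcr || measurement)."""
    return hashlib.sha256(pcr + measurement).digest()

# Start from the all-zero register and fold in each component's digest.
pcr = bytes(32)
for component in [b"Hostboot image", b"PHYP image", b"PFW image"]:
    pcr = pcr_extend(pcr, hashlib.sha256(component).digest())

# A remote verifier recomputes the same chain from the reported event log:
# any substituted or reordered component yields a different final value.
print(pcr.hex())
```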

Saturday 18 January 2020

Environments Where Blockchain Can Thrive

A Blockchain solution can flourish in business scenarios where a high number of participants all want to track a particular product or item. And the more complex the tracking process, the more a Blockchain application can thrive.

IBM Tutorials and Material, IBM Guides, IBM Learning, IBM Certification, IBM Blockchain

For example, if a product traverses a series of steps that starts with its creation and ends with its delivery into the hands of a consumer, then incorporating a Blockchain solution into this process can potentially offer many benefits. It can enhance overall information security, and it can provide substantial time and cost savings for all participants in the product’s life cycle.

Blockchain Benefits


To illustrate this further: when a manufacturer receives a purchase order for a particular item, the product’s life cycle begins. Starting with the purchase order, the manufacturer builds the product and then hands it over to a shipper. This shipper sends the item to a warehouser, who then ships it to a wholesaler. The wholesaler utilizes another shipper to have it sent to a retailer. The retailer then stocks the item until a consumer purchases it. Giving all participants in these steps a way to view where the product originated, i.e. its provenance, and to trace all of its handling can add many benefits, including:

◉ Transparency within supply chains
◉ Immutable information that can be available to all participants
◉ More efficiency in maintaining records
◉ Organized data for auditors and regulators
◉ Reduced or eliminated administrative record keeping errors
◉ Reduced or eliminated processing paperwork


Blockchain for an International Air Services Provider


Recently, an international air services provider, dnata, successfully tested the use of Blockchain technology in its cargo operations. This achievement is a real-life example of the aforementioned scenario.

With the help of IBM and other partners, dnata developed a logistics platform with a Blockchain infrastructure. This platform was put into effect to view supply chain transactions, starting with the purchase order of an item and ending at its delivery to a warehouse. This business use-case exemplifies where a Blockchain can thrive: an environment that has a large number of participants wanting to track products through the supply chain.

Blockchain Solution for Asset Management


Another environment where Blockchain can thrive is when a company transfers assets within a business network. When a company internally transfers a physical asset, such as a laptop or, in the case of a trucking company, a semi-trailer, from one location within its business network to another, there can be many people involved and much related paperwork to keep track of its journey. In this case, a Blockchain can establish a clear trail for the asset being transferred. Acting as a shared ledger, the Blockchain can allow internal company parties to view where an asset has been moved, who handled it, its current and past states, and how many times it has been transferred or even used – all from the same source, i.e. the shared ledger. And it can be viewed at any time by anyone with permission to do so.

Also within the asset management process, there can be many issues, including transfer information split among many different record-keeping systems, conflicting information on transactional updates, and long wait times to resolve discrepancies. These add to costs and subtract from efficiency. A properly executed Blockchain can be the sole source of transfer information and can reduce both the number of asset discrepancies and the time it takes to resolve them.
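
To make the shared-ledger view concrete, the sketch below models transfer events as append-only ledger entries and answers the questions raised above (current holder, handlers, transfer count) from a single source of truth. The field names and sample data are illustrative assumptions, not a specific IBM Blockchain schema.

```python
# Each entry is an append-only transfer event on the shared ledger.
ledger = [
    {"asset": "laptop-001", "from": "warehouse-A", "to": "office-Milan", "by": "j.rossi"},
    {"asset": "laptop-001", "from": "office-Milan", "to": "office-Rome", "by": "m.bianchi"},
]

def current_holder(asset: str) -> str:
    """The 'to' field of the asset's most recent transfer."""
    events = [e for e in ledger if e["asset"] == asset]
    return events[-1]["to"] if events else "unknown"

def transfer_history(asset: str) -> dict:
    """Who handled the asset and how many times it moved."""
    events = [e for e in ledger if e["asset"] == asset]
    return {"transfers": len(events), "handlers": [e["by"] for e in events]}

print(current_holder("laptop-001"))     # office-Rome
print(transfer_history("laptop-001"))   # {'transfers': 2, 'handlers': [...]}
```

Because every party reads the same entries, the discrepancy-resolution delays described above shrink along with the duplicate record-keeping systems that cause them.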

Friday 17 January 2020

What if you could operate 10x faster at half the cost using cognitive RPA?

In a hugely competitive global industry, telecom operators must balance ongoing customer satisfaction against reducing operating costs. Too often, subscale technology investments are made for meager benefits, and automation is bolted onto cumbersome processes supported by decade-old systems. Worse still, automation is often implemented without revisiting the underlying customer experience or evaluating what artificial intelligence (AI) could do to improve productivity.

The telecom industry runs rife with highly manual, voluminous, repetitive and complex rule-based transactions – things such as order validation, service fulfillment, service assurance, billing, revenue management and network management – all closely coupled with multiple legacy systems. The result is rigidity and a lack of transaction visibility. In traditional lead-to-cash processes, this has often led to poor service and customer dissatisfaction. Slowed by the process, customers have little incentive to stay loyal and look for a better offer elsewhere.

Reimagining the customer experience first


Companies are increasingly using robotic process automation (RPA) to automate routine tasks. RPA's potential benefits are manifold: reduced costs, lower error rates, faster turnaround times, more scalable operations and improved compliance. By combining RPA with cognitive technologies to move beyond basic robotics to intelligent interaction, telecom operators can eliminate tedious tasks and deliver cost savings and greater workforce productivity by:

◉ Striking a better balance between the front and the back office, while becoming faster and more reliable

◉ Enabling customers to self-serve so that sellers can focus more time on complex orders

◉ Cross-selling and up-selling through assisted sales, creating recommendations so that all sellers know what the best sellers do

The integration of cognitive technologies and RPA (see Figure 1) is extending automation to a new level, helping telecom operators become more efficient and agile and deliver more consistent services to the customer, which is paramount in the current digital economy.

Figure 1. Moving beyond robotics to intelligent interactions


Entirely new user experiences can be achieved by taking an over-the-top (OTT) approach. The idea is to preserve the legacy systems (also known as systems of record) as the custodians of business transactions. By interfacing with these systems of record through existing APIs or microservices, one can redefine the user experience more freely and use a combination of capabilities ranging from business rules engines and business process management (BPM) to AI, RPA and blockchain.

Using these tools in conjunction, and playing to the strengths of each, amplifies business benefits beyond what is achievable by overextending a single technology. It enables transparent and flexible automation of the business process in response to business needs. Technologies embedded with RPA can provide autonomous decision making, enable reasoning and remembering, and surface new insights through data discovery. For example, using AI and RPA technologies as part of a sales order management process can guide the seller to improve data accuracy by making recommendations that improve over time.
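As a rough sketch of this pattern, the Python below reads an order from a system of record through an existing REST API and lets a trained model flag likely data-entry errors before an RPA bot submits the order. The endpoint URL, token handling, and feature choices are assumptions for illustration, not a specific IBM product API.

```python
import requests

LEGACY_API = "https://erp.example.com/api/v2"  # hypothetical system-of-record endpoint

def fetch_order(order_id, token):
    """Read from the system of record via its existing API; the legacy
    system stays the custodian of the business transaction."""
    resp = requests.get(
        f"{LEGACY_API}/orders/{order_id}",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

def recommend_corrections(order, model):
    """A cognitive layer (any scikit-learn-style classifier trained on past
    order errors) flags suspect orders and suggests a fix before submission."""
    features = [[len(order.get("line_items", [])), order.get("total", 0)]]
    if model.predict(features)[0] == 1:
        return ["Review total against line items before submitting"]
    return []
```

Retraining the model on each seller correction is what makes the recommendations improve over time.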

A process automation platform that sits “over the top” of existing IT interacts with that IT but doesn't require significant change to it. It lets telecom operators design the customer experience they want, then implement the transformed process to support that experience. Because intensive manual intervention isn't required, a process platform delivers positive impacts on productivity, cycle time, cost and customer satisfaction.

Massively reducing operational expenditure


Faster order fulfilment at half the cost, more self-serve options so sellers can pay more attention to complex orders, and intelligent lead-to-cash processes that steer the back office toward higher-value work are key to transforming telecom operations (see Figure 2, based on a tier 1 telco).

Figure 2. Example of operator transformation business case


Dramatic cost reductions can be realized in incremental sprints, with meaningful change possible in just months. Doing so requires focus on the following areas:

◉ Build a business case with the line of business or shared services that will see the greatest gains and align incentives to make collaboration happen. Start with relieving pain points and improving the user experience.

◉ Decouple legacy from a new user experience. Don’t rebuild IT but breathe new life into interactions with older, legacy applications to provide the much-needed budget relief.

◉ Create a center of competence that includes design thinking approaches and build AI and robotic content libraries and extensive industry-specific process flows and business rules. Consider AI process platforms to automate perceptual and judgment-based tasks through the integration of capabilities such as natural language processing, machine learning and speech recognition.

We’re at a key inflection point, moving from a world of processes run by humans supported by technology, to processes run by technology supported by humans. Where are you in seizing the opportunities that cognitive RPA offers?

Thursday 16 January 2020

The hot storage trends for 2020

Now that 2019 has ended, we anticipate incredible storage advancements to come in 2020. Storage is the essential foundation for all your applications, workloads, and data sets. If your storage is not reliable, resilient, performant, and flexible, the value of your most critical business asset, your data, decreases dramatically. Read on to see what is coming your way in 2020 to optimize that essential data foundation: your storage.

1. Storage for hybrid multicloud


The role of data has changed. With the advent of hybrid multicloud environments, businesses across the globe continue to see exponential growth in the amount and types of data they produce. Their long-term success depends on their ability to optimize these vast oceans of data seamlessly across cloud configurations.

In 2020, hybrid multicloud storage will need to support exponential growth across your entire storage estate, which will most likely span a hybrid multicloud deployment. To prepare, be sure your enterprise can easily and transparently move data from on premises to your various cloud providers and back. The “cloudification” of storage will continue as enterprise businesses select the right storage for the right job, no matter the cloud environment. Hybrid multicloud storage has become a reality across all types of data and storage environments and will remain top of mind as enterprise business data continues its incredible expansion.
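As a small illustration of that portability, the sketch below uses the S3-compatible object API that IBM Cloud Object Storage and most public clouds expose, so the same client code can tier data out to a cloud and bring it back on premises. The endpoint and bucket names are hypothetical placeholders.

```python
import boto3

# Hypothetical S3-compatible endpoint; swap in your provider's URL and
# credentials. The calls below are identical across compatible clouds.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.us-south.cloud-object-storage.appdomain.cloud",
)

def tier_out(local_path, bucket, key):
    """Move a cold on-premises file out to cloud object storage."""
    s3.upload_file(local_path, bucket, key)

def tier_back(bucket, key, local_path):
    """Bring the data back on premises when a workload needs it again."""
    s3.download_file(bucket, key, local_path)

# Example: tier_out("/data/archive/2019-q4.parquet", "cold-tier", "2019-q4.parquet")
```

Transparent movement like this is what lets you pick the right storage for the right job without rewriting applications per cloud.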


2. Acceleration and optimization of containers


As data and hybrid cloud grow, so, too, should investment in containers. For businesses developing in hybrid multicloud environments, containers enable data portability and movement across the organization. Some organizations run thousands of containers, creating a new “virtualization” and development layer.

At first, only developers used virtualization, but its use spread across almost all businesses with the advent of VMware and other server virtualization platforms in the data center. Today, we're seeing the same thing happen with containers. They are no longer just for DevOps; they span your data centers and clouds. As containers sweep into enterprise data center and cloud deployments, the discussion now includes optimizing containers for persistent storage, both on premises and across a hybrid multicloud deployment.

With containers moving more and more into the primary storage arena, the question of how you will deliver modern data protection for your container environments becomes increasingly critical. In 2020 we expect rapid expansion of modern data protection into container environments. As you adopt containers, make sure that you can optimize them for primary storage across all protocols – file, block and object – and that you can deliver the best in modern data protection for your containerized primary storage.
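For example, here is a minimal sketch of requesting persistent container storage with the Kubernetes Python client. The storage class name is a hypothetical placeholder for whatever class your array's CSI driver exposes.

```python
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() inside a pod

# Claim 100 GiB of block-backed persistent storage for a containerized app;
# "array-block-gold" is a placeholder storage class, not a real product name.
pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="app-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="array-block-gold",
        resources=client.V1ResourceRequirements(requests={"storage": "100Gi"}),
    ),
)
client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```

Once containers hold state this way, that state needs the same snapshot and backup discipline as any other primary storage.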

3. Storage for AI


AI workloads are growing dramatically, and storage has become a critical foundation for AI success. The key to storage for AI is a single, vast repository that ties easily into your AI, machine learning and deep learning assets. This single repository must be able to span physical data centers and cloud configurations and support exabytes of data. Using a sophisticated AI data pipeline that encompasses data ingestion, organization, analysis, machine learning, deep learning, and archival, the modern AI business will see tremendous benefits as AI becomes commonplace across enterprises. As you search for your AI technology partner, it's worth noting (humble brag) that according to IDC, IBM is #1 in global market share for AI, including services, software and physical infrastructure.
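A skeletal sketch of those pipeline stages might look like the following; the function bodies are deliberately trivial placeholders rather than any specific IBM product API, showing only how data flows from ingestion through to archive.

```python
def ingest(source):
    """Pull raw data in from sensors, logs, and applications."""
    return list(source)

def organize(records):
    """Deduplicate and catalog the data so it is discoverable for training."""
    return sorted(set(records))

def analyze(records):
    """Run exploratory statistics to guide feature and model design."""
    return {"record_count": len(records)}

def train(records):
    """Hand curated data to the ML/DL frameworks; returns a model artifact."""
    return f"model-trained-on-{len(records)}-records"

def archive(records):
    """Shift aged data to cheaper, high-capacity storage tiers."""
    return f"archived {len(records)} records"

raw = ingest(["sensor-a", "sensor-b", "sensor-a"])
curated = organize(raw)
print(analyze(curated), train(curated), archive(curated))
```

The storage implication is that every stage reads and writes the same repository, so it must serve small random reads for training as well as bulk streaming for ingest and archive.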

4. Cyber-resilient storage


As you refine your organization's security strategy in 2020, remember that storage is a key element of your overall cyber resilience capabilities. It is not a question of “if” you will suffer a security attack, but “when.” Most security strategies today center on preventing breaches and, when one occurs, remediating the attack. However, as CIOs across the world will tell you, those attacks may take hours, days, or even weeks to remedy.

Since you are unlikely to keep your systems entirely free of data theft, data corruption, malware, and ransomware, your storage infrastructure is essential to limiting the impact of cyberattacks on your company and its data. From encryption at rest and in flight, to air gapping, to malware and ransomware detection, to rapid recovery from a cyber incident, to multifaceted administration controls, your storage infrastructure must deliver the right technologies to support a holistic cyber security strategy for your corporate data.
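As a toy example of just one of those layers, encryption at rest, the sketch below uses the open-source Python cryptography library. In production, the key would live in an external key manager (never beside the data) and the array would encrypt inline, but the principle is the same: stolen media without the key yields only ciphertext.

```python
from cryptography.fernet import Fernet

# Generate a data-encryption key; a real deployment would fetch this from a
# key manager rather than creating it next to the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

plaintext = b"customer-records.csv contents"
ciphertext = fernet.encrypt(plaintext)

# Only a holder of the key can recover the data.
assert fernet.decrypt(ciphertext) == plaintext
```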

5. End-to-end NVMe


In late 2018 and through 2019, the latest high-performance storage interface technology, NVMe, started to be incorporated into storage array solutions. For example, IBM Storage delivered its first NVMe storage array in July 2018. As 2019 progressed, the market heard more and more about NVMe over data center fabrics: Fibre Channel, Ethernet and InfiniBand. Throughout 2019 there was a lot of “kicking the tires” on the various versions of NVMe over fabrics, but not much actual user deployment. 2020 will likely be the inaugural year in which NVMe over fabrics sees real enterprise deployments. For more on the topic, check out NVMe over Fibre Channel for Dummies.

6. Storage Class Memory


Storage Class Memory (SCM) is in its early phase of use as IBM and other technology partners begin shipping SCM to early adopters. Its advantage is high-speed persistent storage for computing across complex data sets where performance is critical to business operations. While 2019 saw lots of early announcements about SCM, 2020 will bring real deployments by end users. A common deployment scenario will likely be SCM in a hybrid configuration within an all-flash array.

Given the high price of SCM, a hybrid array that combines SCM, flash storage, and AI-based automated tiering (such as the Easy Tier feature in IBM Spectrum Virtualize) will provide the most cost-effective yet performant option for high-performance applications. Easy Tier places the most heavily used data on the top tier (SCM), while data that is not being used shifts to less expensive storage.
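A toy illustration of heat-based placement, in the spirit of (but not reproducing) Easy Tier's actual algorithm: rank extents by recent I/O activity and pin the hottest ones to the small, fast SCM tier.

```python
def place_extents(access_counts, scm_slots):
    """access_counts maps extent IDs to recent I/O counts; the hottest
    scm_slots extents land on SCM and the rest stay on flash."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    hot = set(ranked[:scm_slots])
    return {extent: ("SCM" if extent in hot else "flash") for extent in ranked}

counts = {"e1": 950, "e2": 12, "e3": 640, "e4": 3}
print(place_extents(counts, scm_slots=2))
# {'e1': 'SCM', 'e3': 'SCM', 'e2': 'flash', 'e4': 'flash'}
```

Real tiering also weighs recency, migration cost, and write endurance, but the heat-ranking intuition is the same.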

Wednesday 15 January 2020

Blockchain and other emerging technologies for financial services

Blockchain and cryptoassets have rightly received a large amount of coverage and analysis during the last few years. In the background, and potentially representing technology tools and applications that are more applicable for most organizations at this stage, artificial intelligence has continued to permeate an array of industry verticals.


With the continuous debate around leveraging artificial intelligence to help transform autonomous vehicles from concept to reality, AI represents an equally hot topic. Even with this continued development and integration, however, there remains some ambiguity and confusion as to how this technology will impact the broader accounting and financial services sectors.

Some of this confusion comes from the conflation of AI with other automation technologies, some arises from the deluge of other technology tools — like blockchain — driving innovation in the market, and a portion is derived from the fact that AI is not any one single thing, but rather an umbrella term.

For any emerging technology, including blockchain, to operate as effectively as advertised, practitioners must understand not only that specific tool but also how it connects with other cutting-edge tools such as AI.

Read these excerpts from my new book Blockchain, Artificial Intelligence and Financial Services – Implications and Applications for Finance and Accounting Professionals, where I begin to tackle some of the issues so important to accounting and finance professionals:

“The dual-headed disruption tidal wave of blockchain-enabled activities and artificial intelligence will invariably lead to anxiety, stress, and potentially misunderstanding of just what these technologies represent for financial services. Blockchain, hopefully, at this point has been demystified to a certain extent, but the idea of artificial intelligence may seem like a more amorphous concept that is both difficult to understand and potentially disruptive in nature. While artificial intelligence has been featured in numerous media outlets, movies, and TV shows, the image most often presented to audiences and market actors is one that, almost invariably, has negative connotations and implications for developers and users. Fortunately, while there have been numerous advances in the development and implementation of artificial intelligence, the limits of current iterations are still substantial. In other words, there is no need to fear the Terminator coming for financial practitioner roles. Prior to diving into what the applications and implications of AI may very well be, however, it seems appropriate to first put forward a definition that makes sense in the context of this discussion. Not meant to be overly technical, but rather to assist financial professionals seeking to understand and explain the implications of AI, the following working definition is a workable option:

Artificial intelligence is either a computer program or suite of programs that can either augment or eventually replace the need for human engagement and oversight in entire processes or at least portions of processes.

Artificial intelligence may have initially received more attention and media coverage but has subsequently received less coverage and analysis due to the somewhat amorphous nature of the idea itself (Lee, 2018). Blockchain and cryptocurrency may also be difficult to understand and appear to be relatively new concepts, but despite this initial confusion there are similarities between these technology tools and preexisting options. A decentralized ledger system, otherwise labeled a decentralized ledger technology (DLT) platform, is, of course, different from current centralized options, but its underlying components can be related to tools like Excel and Access. Additionally, cryptocurrencies are simply a representation of how several technology tools have been combined, namely encryption, peer-to-peer processing, and various components of consensus-based data verification, which are not, by themselves, innovative or unique.

In contrast to these tools and platforms, the idea and concept of artificial intelligence can appear murky and unrelated to current technology or processes. Also, especially for financial services professionals, the challenges and threats of automation, digitization, and increased efficiency do have the potential to displace and disrupt core functions that financial professionals actually perform. It is true that automation and digitization are not new issues and trends in the accounting and financial fields, but they do seem to be accelerating as well as mirroring events that have already occurred in other industry sectors (Mehendale, 2018). Examples abound in the marketplace, including a recently highlighted case of how J.P. Morgan is leveraging artificial intelligence tools to improve the speed and efficiency with which contracts and other paperwork are reviewed and analyzed. This, however, is only one example of how artificial intelligence is being used in the marketplace, without even drilling down into the work underway at IBM.”

“Artificial intelligence is often tossed around and discussed as if it represents one type of platform or technology, but this is an incomplete view of just what AI means and can imply for professionals and organizations. Without diving too far into the technical weeds of the different classes and types of AI, the categories include, but are not limited to:

1) Computational AI
2) Linguistic AI
3) Spatial AI
4) Reactive computing
5) Limited Memory
6) Theory of mind
7) Self-awareness”

It is impossible to predict just how the simultaneously developing technologies of artificial intelligence, blockchain, robotic process automation, and other automation tools will ultimately impact the accounting and financial services space. Predicting the future is, even under the calmest conditions, a difficult task and it would be difficult to categorize the current business landscape as calm.

Blockchain has occupied, and assuredly will continue to occupy, a prominent place in the technology conversation for years to come, but it is not alone. For practitioners and organizations to realize the benefits of emerging technologies, it is important to understand them both as individual tools and as complementary resources.

With implementation and adoption continuing to accelerate both inside and outside of the financial services space, proactive practitioners and firms are well positioned to thrive. No matter what aspect is analyzed, it would be safe to say that 2020 is shaping up to be an exciting year.

Tuesday 14 January 2020

Oil & Gas Upstream Integrated Operations Evolution


Once upon a time, a few large oil & gas companies developed point solutions to address specific production issues. Many lessons were learned and a lot of money was spent on bespoke software solutions, all developed in the quest for innovation. Many niche solutions were born from these early exercises, and many academics developed papers and models around the new concept of ‘Integrated Operations’ (IO). Definitions vary across the industry, but evidence of true value realized was difficult to quantify.

Once these companies tried to scale and standardize these bespoke solutions, realizing the same value on older, prized assets became less attractive, given those assets were developed many years ago under different operating constructs and infrastructure. The return-on-investment case for expanding these solutions weakened accordingly.

Even in an environment of strong oil prices and innovation, most of these cases have had mixed results (other than structured collaboration – by which I mean the formal process of team working rather than collaboration centers).

Move on to the current day, where the focus has shifted to reducing operating costs within 3-6 month project cycles. Just knowing that there is or was a problem is no longer enough to justify spending. Across the industry, we have seen a dramatic change in how these projects evolve, with greater expectations from the business community on cash flow.

Now that the industry knows what can be done, it asks how to improve its value chain using a process-driven methodology on proven technology platforms across the enterprise. Involving all users in the value chain is no longer the domain of a niche solution, nor is the process executed within the confines of a collaboration center. The enterprise is driving technology behavior to involve all users in the value chain rather than a specific group.

Before all the Change Managers provide their views on the well-trodden path of Change being important in IO: I hear you, and I agree. Change is more than teaching a user how to use an IO Center or running specific process and team-working exercises. In fact, I see too many collaboration centers where Change Management teams have left an A4 piece of paper as the operating manual for a multi-million investment. The business community is generally left without any real context on how to use new tools and adopt new ways of working. Change today needs to be different and requires a much broader skill set than traditionally accepted.

Today we are witnessing the emergence of a new form of Integrated Operations. I don't want to call it ‘Next Gen…’, as it is not born from the same basics as IO and challenges conventional thinking. Neither do I want to use other terms like ‘Born Smart…’, which imply you have the chance to embed the right technology from the outset, something that does not apply to aged assets.

Today it's not about technology; it's about joining the dots, and I like what we are seeing emerge. We are moving to a more sustainable model that can adapt to the enterprise and engage a wider audience, free of a fixed working environment, driven by process toward a controlled cost model and business outcome.

Instead, we can now start to address some of the bigger questions, such as how to reduce the cost of ownership in line with asset depreciation, similar to models used in the car industry. IO ways of working must be able to flex to reflect that operators need costs to decrease over time, in line with decreasing revenues from aging assets. This is a model I am now beginning to shape my thoughts around, and I look forward to applying those principles in the near future.

More Reading…