Thursday 27 May 2021

3 steps to help protect your B2B transactions and file exchanges from security threats


Today’s cyber security landscape is evolving as hackers take advantage of digital hyperconnectivity to creatively access networks and systems. Ransomware attacks are increasing and adopting new tactics, and supply-chain attacks like SolarWinds and Accellion exploit third-party systems and software to find backdoor entries into enterprises and spread malware, often with the motive of stealing sensitive data or disrupting operations. When a breach occurs, it can take time to detect, is typically difficult to eradicate, and can cause significant damage over a long period. Recent analysis by IBM estimates the average cost of a data breach at $3.86 million, with mega breaches (50 million records or more stolen) reaching $392 million.

So, what can you do to help safeguard your B2B transactions and file exchanges and mitigate risk to high-value digital assets? These three steps can help enterprises strengthen their security posture:

◉ Limit the exposure to threats.

◉ Limit the spread if a threat is already inside your network.

◉ Recover and get back to business.

Let’s look at each one of these points briefly.

How do you limit the exposure? To help prevent intruders from sneaking into your trusted zones, establish a strong foundation by securing the digital entry points where external information enters your enterprise safe zones, starting with the most susceptible points and working toward the least. Your IT security teams might already follow best practices like encryption, permission models, secure access and authentication. However, when it comes to internet-facing information and file exchanges with your trading partner community, you need the even higher level of security that a defense-in-depth strategy provides.

With accelerating digitization, many enterprises today conduct large volumes of internet-based transactions. Implementing strong edge security for your Managed File Transfer (MFT) solution can help identify whether incoming payloads are clean and coming from trusted sources. It’s a complex challenge with thousands of trading partners knocking on your enterprise doors multiple times a day, using multiple routes and protocols, and delivering information in various formats. This does not make life easy for an MFT system or its security administrator. The inflow is never consistent, and payload sizes vary.

It’s like an airport terminal with thousands of travelers entering and exiting every minute of the day. Like an effective file transfer solution, the security gate helps manage the inflow, but you can imagine the risk if even one ill-intentioned person, posing as an ordinary traveler, sneaks through. You also need robust built-in security capabilities (like full-body scanners at security gates) as well as advanced configurable capabilities (think extra screening, or K-9 units sniffing baggage at random).

In the case of MFT, a few of these security capabilities include: multifactor authentication, validating incoming connections in real time against sources that are updated frequently, scanning files for viruses before they land in the trusted zone, and ensuring that no data ever lands on disk in the demilitarized zone (DMZ). Also consider the versatility of the edge security capability, since it doesn’t operate in isolation. Implementing a robust edge security solution, like IBM Sterling Secure Proxy, with flexible options to configure and integrate with other solutions in your existing technology stack can be important to limiting exposure to security breaches.
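Two of those edge checks — validating the source of a connection and keeping payloads out of DMZ storage — can be pictured with a short sketch. This is purely illustrative: products like IBM Sterling Secure Proxy implement these controls natively, and the networks, digests and function names below are hypothetical.

```python
import hashlib
import ipaddress

# Hypothetical allowlist of trading-partner networks and a placeholder
# threat feed of known-bad payload digests (illustrative values only).
TRUSTED_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]
KNOWN_BAD_DIGESTS = {"<sha256-of-known-malware>"}

def connection_is_trusted(source_ip: str) -> bool:
    """Validate the incoming connection against the partner allowlist."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in TRUSTED_NETWORKS)

def payload_is_clean(payload: bytes) -> bool:
    """Stand-in for a virus scan: compare the payload digest to a threat feed."""
    return hashlib.sha256(payload).hexdigest() not in KNOWN_BAD_DIGESTS

def relay(source_ip: str, payload: bytes) -> bytes:
    """Relay a payload through the DMZ entirely in memory -- nothing lands on disk."""
    if not connection_is_trusted(source_ip):
        raise PermissionError(f"untrusted source {source_ip}")
    if not payload_is_clean(payload):
        raise ValueError("payload failed scanning")
    return payload  # handed off to the trusted zone, never written in the DMZ
```

The key design point is that the DMZ component only forwards bytes it has already vetted and never persists them, so a compromise of the DMZ host exposes no stored data.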

How do you limit the spread? Despite all your best efforts, there is a chance you might find a bad actor within your trusted zones. What’s important then is to prevent it from spreading further and wreaking more havoc. One way is by allowing only allow-listed servers to talk to authorized systems. Another best practice is to avoid common protocols like FTP and instead use proprietary protocols like IBM Sterling Connect:Direct over SFTP, which can help provide high-volume, security-rich enterprise file transfers. Restricting the number of endpoints, and using proprietary protocols and a solution architected for enterprise-class secure file transfer, can help limit the damage from the spread and assists in the next step – recovery.
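The allow-listed-servers practice amounts to a pairwise route policy: a transfer may only happen between explicitly authorized endpoints. In practice this is enforced in firewalls and in the file transfer product’s configuration rather than in application code; the server names below are hypothetical.

```python
# Hypothetical sketch: only explicitly allow-listed server-to-server
# routes exist; everything else is denied by default.
ALLOWED_ROUTES = {
    ("mft-gateway", "erp-prod"),      # gateway may deliver to ERP
    ("erp-prod", "warehouse-prod"),   # ERP may forward to the warehouse
}

def transfer_permitted(src: str, dst: str) -> bool:
    """A route exists only if it was explicitly allow-listed."""
    return (src, dst) in ALLOWED_ROUTES

# A compromised gateway cannot jump straight to the warehouse, which
# limits how far an intruder can spread from one foothold.
```

Deny-by-default routing like this is what turns one compromised server into a contained incident instead of a network-wide one.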

How do you recover? Once you identify the impacted systems, you should immediately clean and restore the environment. Restoration can involve a complete rebuild of the systems from the operating system up and changing all credentials and certificates. The process often requires maintaining multiple credential stores or managing them individually on each server, which gets complex very quickly and is time-consuming. A solution like IBM Sterling Partner Engagement Manager (PEM) can make it possible to change all credentials and certificates in one place. With the use of campaigns, updates to credentials and certificates with trading partners can be handled automatically, saving time and limiting the risk and duration of business interruption.
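The value of rotating everything from one place can be sketched as a single pass over all partners. This is a hypothetical data model for illustration only; PEM’s actual campaign mechanism is configuration-driven, not code you would write.

```python
import secrets
from dataclasses import dataclass

# Hypothetical model of centralized credential rotation across trading
# partners, in the spirit of a PEM "campaign". Names are illustrative.

@dataclass
class Partner:
    name: str
    credential: str = ""

def rotate_all(partners):
    """Issue a fresh credential to every partner in one pass and
    return an audit map of partner name -> new credential."""
    audit = {}
    for p in partners:
        p.credential = secrets.token_urlsafe(32)  # cryptographically strong secret
        audit[p.name] = p.credential
    return audit

partners = [Partner("acme"), Partner("globex")]
audit = rotate_all(partners)
```

Because every partner is rotated in one operation from one place, recovery time no longer scales with the number of servers or partner-specific credential stores.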

There is no magic bullet to guarantee 100% protection from security incidents. However, by following these three steps you can decrease your risk exposure, limit damage and build resiliency into your systems to recover quickly.

Source: ibm.com

Saturday 22 May 2021

IBM CIO Office integrates IBM Z into the hybrid cloud


In a multi-technology landscape, it is increasingly difficult for companies to choose the path that eases management across the IT landscape. However, it’s a decision worth investing in, as making the right choice can enable a better user experience.

Many organizations today face challenges with their core IT infrastructure:

◉ Complexity and stability: An environment might have years of history and might be evaluated as too complex to maintain or update. Problems with system stability can impact operations and be considered a high risk for the business.

◉ Workforce challenges: Many data center teams are anticipating a skills shortage for some platforms within the next few years due to a retiring and declining workforce.

◉ Total cost of ownership: Some infrastructure solutions are evaluated as too expensive, and it’s not always easy to balance up-front costs with the lifecycle cost and benefits.

To solve these challenges, IT organizations of all sizes have been investing heavily in a move to the cloud over the last few years. Many of them have found that the use of open, industry-standard APIs is very helpful.

Embracing a hybrid cloud approach based on open industry standards is better than opting for a platform-specific strategy. The IBM CIO Office determined the best approach for its needs and realized significant benefits, such as improved efficiency, agility and overall economics.

IBM Z is providing constant business value

“Our IBM Z® systems are still the high-performance, ultra-stable platform that our organization has relied on to support huge numbers of transactions for years and the assets running on these systems provide valuable content. That’s why it’s important for our organization that we include the IBM Z systems into our user portal,” says Eric Everson Mendes Marins, PCS_ZVM squad at IBM CIO Office. “We had been using a traditional IT operations model with individual management pillars for the infrastructure components. With that model it was hard and complex to manage the infrastructure assets across the multiple platforms. We had to develop scripts, and some administrator tasks had to be done manually. But what many people might not know is that IBM Z can be integrated seamlessly into an open hybrid cloud IT infrastructure.”

Creating a cloud provisioning portal for improved user experience

The CIO Office is helping its users to realize the potential of IBM Z, as well as the potential of IBM Power and x86 technology, demonstrating its power as part of an open hybrid cloud IT strategy.

“We aim to offer easy access and handling for our users—which means creating a portal for our cloud services that provide the access to our hybrid cloud IT environment,” explains Marins.

As part of the consolidation of the IT environment, the transition from the traditional IT operations model to a cloud operations model was implemented. For the integration of the multiple platforms into one cloud provisioning portal, the CIO Office wanted to use open standards and decided to use OpenStack APIs.

IBM Cloud Infrastructure Center provides an infrastructure-as-a-service (IaaS) layer for IBM Z, enabling virtual machines to be provisioned and managed, automating services, and providing a platform for building higher-level cloud services.

Cloud Infrastructure Center helps meet the demands of a hybrid cloud strategy. It provides a ready-to-use solution for a consistent user experience, with integration capabilities for cloud management tools and built-in OpenStack-compatible APIs to provision and orchestrate cloud workloads.
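Because the APIs are OpenStack-compatible, provisioning a virtual machine follows the standard OpenStack compute request shape. The helper below builds the minimal JSON body of a `POST /servers` call as a sketch; the image, flavor and network IDs are placeholders that would come from the cloud’s own catalogs.

```python
import json

def build_server_request(name: str, image_ref: str, flavor_ref: str,
                         network_uuid: str) -> str:
    """Build the JSON body for an OpenStack-compatible POST /servers call.
    The ID values are placeholders; a real client would look them up in
    the cloud's image, flavor and network catalogs first."""
    body = {
        "server": {
            "name": name,
            "imageRef": image_ref,      # e.g. a z/VM Linux guest image
            "flavorRef": flavor_ref,    # CPU/memory sizing profile
            "networks": [{"uuid": network_uuid}],
        }
    }
    return json.dumps(body)

payload = build_server_request("linux-guest-01", "IMAGE-ID", "FLAVOR-ID", "NET-ID")
```

The point of the open standard is that this same request shape works whether the target is z/VM-based virtual machines behind Cloud Infrastructure Center or any other OpenStack-compatible endpoint in the hybrid cloud.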

Its integration capability increased flexibility and improved efficiency at the CIO Office with a common skillset, simplifying the IBM z/VM-based virtual machine lifecycle management on IBM Z and providing unified cloud provisioning.

The entire setup of Cloud Infrastructure Center was done very quickly, in a few steps, and helps the CIO Office reduce the cost and complexity of managing a hybrid cloud environment.

“IBM Cloud Infrastructure Center allows us to substantially improve our infrastructure management and reduce cost and complexity to manage from simple to complex environments,” says Marins.

The main values of integrating IBM Z into the hybrid cloud approach are increased business acceleration, higher developer productivity, increased infrastructure efficiency, improved risk and compliance management and long-term strategic flexibility.

Source: ibm.com

Thursday 20 May 2021

Announcing Anaconda for Linux on IBM Z & LinuxONE


A clear trend is emerging in the era of hybrid cloud: winning enterprises will likely pull ahead by scaling the value of their data with AI.

For many IBM Z® & IBM LinuxONE customers, the enterprise platform often serves as the system of record for their mission-critical data and applications. Data scientists often look for open-source solutions, and we are committed to embracing and bringing open-source AI capabilities to Z and LinuxONE that can support real-time AI decision-making at scale.

Today, IBM and Anaconda, provider of the leading Python data science platform, are announcing the general availability of Anaconda for Linux on IBM Z & LinuxONE. Anaconda on Linux on Z & LinuxONE is the latest step toward bringing popular data science frameworks and libraries to these enterprise platforms, providing a consistent data science user experience across the hybrid cloud.

Data scientists who already know and love Anaconda can now expand their open-source data science experience to include IBM Z & LinuxONE, while continuing to work with their favorite tools and frameworks like conda, XGBoost and SciKit-Learn. This expands and enables choice in AI frameworks and tooling for end-to-end data science directly on the platform, including development, training, testing and production. Data scientists can benefit from the security capabilities, high availability and scalability of the IBM Z & LinuxONE platforms when implementing AI deployments that score time-sensitive workloads or transactions as they take place. Anaconda runs natively on Linux on IBM Z, and through z/OS Container Extensions (zCX) on z/OS, the solution brings open-source data science tools close to key workloads, leveraging the data gravity of the Z and LinuxONE platforms.

According to new research commissioned by IBM in partnership with Morning Consult, 90% of respondents said that being able to build and run AI projects wherever their data resides is important. Workloads running on IBM Z & LinuxONE often need to adhere to strict latency and SLA requirements to support transactions that are key to our modern life such as online purchases. With Anaconda for Linux on Z & LinuxONE, organizations can perform AI analysis in close proximity to their data, addressing latency to deliver insights where and when they are needed.

Customers can start using Anaconda Individual Edition and Anaconda Commercial Edition today by downloading the Individual Edition or Miniconda installer, and following the associated installation documentation:

Individual Edition

Miniconda
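For orientation, the install flow on a Linux on Z system looks roughly like the following. These commands are illustrative only: the exact installer file name comes from the download links above, and the environment name and package list are examples.

```shell
# Illustrative flow -- the installer file name is an assumption; use the
# one listed on the download pages linked above for the s390x platform.
bash Miniconda3-latest-Linux-s390x.sh -b -p "$HOME/miniconda3"
source "$HOME/miniconda3/bin/activate"

# Create an environment with the frameworks mentioned in this post.
conda create -y -n ds python scikit-learn xgboost
conda activate ds
```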

Source: ibm.com

Thursday 13 May 2021

Secure Your Job By Taking IBM C2090-621 Certification


The IBM Professional C2090-621 Certification Program gives you the ability to earn credentials that demonstrate your expertise. It is designed to prove your skills and capability to perform role-related tasks and activities at a specified level of competence. An IBM Professional C2090-621 Certification demonstrates your expertise in the related IBM technologies and solutions to your colleagues and employers.

This C2090-621 certification enables learners to build advanced reports, active reports, and dashboards using relational data and uploaded files, and to enhance, customize, and manage professional accounts. For this certification, aspirants are advised to have the described level of skills, including experience using a modern web browser, a basic understanding of database concepts, SQL and JavaScript, and knowledge of how to present data visually.

The IBM Certified Designer - IBM Cognos Analytics Author V11 as of R3 is responsible for building advanced reports, active reports, and dashboards using relational data and uploaded files and enhancing, customizing, and managing professional accounts. This individual has project-related experience authoring complex and intermediate level reports and dashboards and can participate in project implementations as an influential team member.

Prerequisites

◉ Knowledge of your business requirements

◉ IBM Cognos Analytics for Consumers (v11) WBT, or equivalent knowledge

◉ IBM Cognos Analytics: Author Reports Fundamentals (v11), or equivalent knowledge

This C2090-621 exam provides participants with an understanding of Active Report content and functionality within IBM Cognos Analytics 11 - Reporting. Through lectures, demonstrations, and exercises, participants increase their IBM Cognos Analytics experience by building highly interactive reports using Active Report controls, which can then be distributed to and consumed by users in a disconnected environment, including on mobile devices.

Impact on Perception of Employers When You Have IBM C2090-621 Certification

You can enter the job market without getting certified, but that is not enough in the modern scenario. You have to add something to your resume that makes you stand out and be more noticeable.

Getting IBM C2090-621 certification is not just beneficial in the process of hiring but goes way beyond this.

If you are looking forward to entering the networking field and making a remarkable career, the following are the advantages you can get from IBM C2090-621 certification:

1. Effective Non-technical Skills:

IBM C2090-621 certification is not limited to technical skills. Candidates are also exposed to innovative thinking and problem-solving; you need to get creative when the standard topologies do not offer tried-and-tested results. It also builds teamwork and communication with members of the team. These are skills that always rank high on employers’ wish lists.

2. Bigger Technology Picture:

When you enter a job, there is a chance you will work with just one or two technologies, which narrows your focus. This is not the case when you get certified, since you gain exposure to a variety of technologies and topics. This gives you more expertise and a broader perspective that will help advance your career. You will also get hands-on experience with the latest trends in development and innovation that make your career shine.

3. Accelerate Fast on Growth Track:

The promotion process in your job is accelerated when you have IBM C2090-621 certification, which also enables you to enjoy much higher salary hikes.

4. Continue with Learning:

You can go as far as you aspire with the aid of IBM C2090-621 certification. This certification is among the industry’s most prestigious, helping you pursue and develop on the professional front, keep your learning alive, and stay on track with constant technology and innovation updates.

5. Be Prominent from the Crowd:

Competition in the job market is intensifying, and recruiters’ task is to narrow the talent pool. When hiring managers sort through applicants, they are attracted to candidates whose completed certifications reflect staying power, initiative, and judgment. Certification also demonstrates your passion for the subject, along with the ability to stick with a project.

6. Preference to C2090-621 Certified Candidates by Employers:

Surveys of hiring managers find that certified employees perform better than those who do not hold any certification. According to several hiring managers and recruiters, IBM C2090-621 certification is an excellent way to enter the industry and demonstrate your passion for the subject matter.

Summary

A higher level of IBM certification means a higher salary plus more job opportunities. The cost of the IBM C2090-621 certification program is affordable compared to other certifications and the related prospects. The IBM C2090-621 certification gives tremendous value to any person who wants to achieve the highest level of success in the networking and IT industry.

Sunday 9 May 2021


New certification: IBM Certified Developer – Business Automation Workflow V20.0.0.2 using Workflow Center


IBM is pleased to announce the release of a new certification – IBM Certified Developer – Business Automation Workflow V20.0.0.2 using Workflow Center.

This intermediate level certification is intended for BAW Developers who lead and contribute to the delivery of complex process and case applications.

These developers implement high-quality process-driven and case-driven business solutions using IBM Business Automation Workflow v20.0.0.2. This exam does not include IBM Integration Designer, FileNet P8 Process Designer or Business Automation Studio application service features.

These developers are expected to be generally self-sufficient and be able to perform the tasks involved in the role with occasional assistance from peers, product documentation, or vendor support services. Practical experience and in-depth product knowledge are required to ensure success on the test.

To attain the IBM Certified Developer – Business Automation Workflow V20.0.0.2 using Workflow Center certification, candidates must pass C1000-116, IBM Business Automation Workflow V20.0.0.2 using Workflow Center Development.

This exam consists of seven sections:

◉ Architecture

◉ Workflow Development

◉ User Interface Development

◉ Service Development

◉ Document Management

◉ Error Handling, Debugging and Troubleshooting

◉ Deployment and Governance

Source: ibm.com

Thursday 6 May 2021

Mimicking the brain: Deep learning meets vector-symbolic AI


Machines have been trying to mimic the human brain for decades. But neither the original symbolic AI that dominated machine learning research until the late 1980s nor its younger cousin, deep learning, has been able to fully simulate the intelligence the brain is capable of.

One promising approach towards this more general AI is in combining neural networks with symbolic AI. In our paper “Robust High-dimensional Memory-augmented Neural Networks” published in Nature Communications, we present a new idea linked to neuro-symbolic AI, based on vector-symbolic architectures.

We’ve relied on the brain’s high-dimensional circuits and the unique mathematical properties of high-dimensional spaces. Specifically, we wanted to combine the learning representations that neural networks create with the compositionality of symbol-like entities, represented by high-dimensional and distributed vectors. The idea is to guide a neural network to represent unrelated objects with dissimilar high-dimensional vectors.

In the paper, we show that a deep convolutional neural network used for image classification can learn from its own mistakes to operate with the high-dimensional computing paradigm, using vector-symbolic architectures. It does so by gradually learning to assign dissimilar (such as quasi-orthogonal) vectors to different image classes, mapping them far away from each other in the high-dimensional space. More importantly, it never runs out of such dissimilar vectors.
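The quasi-orthogonality mentioned above is a generic property of high-dimensional spaces, and it is easy to see numerically: two randomly drawn bipolar (+1/−1) vectors of large dimension have cosine similarity close to zero. The snippet below is a self-contained illustration of that property, not the paper’s code.

```python
import math
import random

random.seed(0)
d = 10_000  # dimension of the high-dimensional space

def random_bipolar(dim):
    """Draw a random vector with components in {-1, +1}."""
    return [random.choice((-1, 1)) for _ in range(dim)]

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

u, v = random_bipolar(d), random_bipolar(d)
# Expected |cosine| is on the order of 1/sqrt(d), about 0.01 here:
# two independently drawn vectors are quasi-orthogonal.
print(abs(cosine(u, v)))
```

Because almost every random pair is nearly orthogonal, the space supplies an effectively inexhaustible pool of mutually dissimilar vectors to assign to new classes.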


This directed mapping helps the system to use high-dimensional algebraic operations for richer object manipulations, such as variable binding — an open problem in neural networks. When these “structured” mappings are stored in the AI’s memory (referred to as explicit memory), they help the system learn—and learn not only fast but also all the time. The ability to rapidly learn new objects from a few training examples of never-before-seen data is known as few-shot learning.

High-dimensional explicit memory as computational memory


During training and inference using such an AI system, the neural network accesses the explicit memory using expensive soft read and write operations. They involve every individual memory entry instead of a single discrete entry.

These soft reads and writes form a bottleneck when implemented in conventional von Neumann architectures (e.g., CPUs and GPUs), especially for AI models demanding millions of memory entries. Thanks to the high-dimensional geometry of our resulting vectors, their real-valued components can be approximated by binary or bipolar components, taking up less storage. More importantly, this opens the door to efficient realization using analog in-memory computing.

Such transformed binary high-dimensional vectors are stored in a computational memory unit, comprising a crossbar array of memristive devices. A single nanoscale memristive device is used to represent each component of the high-dimensional vector that leads to a very high-density memory. The similarity search on these wide vectors can be efficiently computed by exploiting physical laws such as Ohm’s law and Kirchhoff’s current summation law.
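The crossbar similarity search described above amounts to computing, in one analog step, the dot product of a query against every stored vector. A digital sketch of the same operation (illustrative only; the hardware computes each dot product physically via currents and column sums):

```python
import random

random.seed(1)
d, n = 1_024, 8  # vector width and number of stored memory entries

# Stored bipolar memory vectors -- each row plays the role of one row of
# memristive conductances in the crossbar (illustrative stand-in only).
memory = [[random.choice((-1, 1)) for _ in range(d)] for _ in range(n)]

def nearest(query):
    """Return the index of the stored vector most similar to the query.
    In the analog crossbar, each dot product below is computed physically:
    Ohm's law gives per-device currents and Kirchhoff's current summation
    law sums them along each column."""
    scores = [sum(q * m for q, m in zip(query, row)) for row in memory]
    return max(range(n), key=scores.__getitem__)

# A noisy copy of entry 3 (about 5% of components flipped) still
# retrieves entry 3, because the clean dot product dominates the noise.
query = [(-c if random.random() < 0.05 else c) for c in memory[3]]
print(nearest(query))
```

This noise tolerance is why the scheme degrades so gracefully on real, noisy nanoscale devices: the correct entry’s similarity score sits far above the near-zero scores of unrelated quasi-orthogonal entries.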

This approach was experimentally verified for a few-shot image classification task involving a dataset of 100 image classes with just five training examples per class. Despite operating with 256,000 noisy nanoscale phase-change memristive devices, the system showed just a 2.7 percent accuracy drop compared to conventional high-precision software realizations.

We believe that our results are a first step toward directing learning representations in neural networks to symbol-like entities that can be manipulated by high-dimensional computing. Such an approach facilitates fast and lifelong learning and paves the way for high-level reasoning and manipulation of objects.

The ultimate goal, though, is to create intelligent machines able to solve a wide range of problems by reusing knowledge and being able to generalize in predictable and systematic ways. Such machine intelligence would be far superior to the current machine learning algorithms, typically aimed at specific narrow domains.


Source: ibm.com

Tuesday 4 May 2021

Can your storage infrastructure support hybrid cloud?


The paradigm is shifting. The days of the monolithic application are dwindling in favor of applications built using the microservice model. Applications built upon the microservice model using containers are referred to as “cloud-native” and are the future of application development and delivery. For this very reason, organizations are deploying on-premises clouds to leverage this new application development and delivery platform. In many cases, organizations connect their on-premises cloud (also known as private cloud) to a public cloud provider, creating a hybrid cloud to take advantage of the flexibility and agility of workload placement.

In Cloud 1.0, the early days of cloud, the focus was on the cloud service models of IaaS, PaaS and SaaS. This included the cloud consumption model of “pay only for what you consume,” also known as metered billing, and the OPEX nature of these cloud services. In summary: Cloud 1.0 was about a new service delivery and consumption model.

In Cloud 2.0, the present day of cloud, the focus has shifted to the agility and flexibility of application development, application delivery and workload placement, with an understanding that the features of Cloud 1.0 are still desired and sought after. In summary: Cloud 2.0 is about a new application development and delivery platform.

An on-premises cloud is a software stack that resides on clustered servers and relies on the underlying storage infrastructure, and users expect cloud-native applications to be “always available, anywhere, at any time.” For many organizations contemplating or about to implement an on-premises cloud, the questions are: “Can my current storage infrastructure support on-premises cloud?” and “If not, what items do I need to address, and what actions do I need to take?”

Good news: IBM Lab Services has built the Hybrid Cloud Acceleration (HCA) assessment to answer those exact questions. For organizations that would like consultative assistance with deep storage infrastructure and cloud expertise, the HCA offering is a great vehicle to ensure your storage infrastructure is prepared for the onboarding of an on-premises cloud stack and can support hybrid cloud functionality.

Source: ibm.com