talk-data.com

Topic: Cyber Security

Tags: cybersecurity, information_security, data_security, privacy

2078 activities tagged

Activity Trend: 297 peak/qtr (2020-Q1 to 2026-Q1)

Activities

2078 activities · Newest first

Securing SQL Server: DBAs Defending the Database

Protect your data from attack by using SQL Server technologies to implement a defense-in-depth strategy, performing threat analysis, and encrypting sensitive data as a last line of defense against compromise. The multi-layered approach in this book helps ensure that a single breach does not lead to the loss or compromise of data that is confidential and important to the business. Database professionals today deal increasingly often with repeated data attacks against high-profile organizations and sensitive data, so it is more important than ever to keep your company's data secure. Securing SQL Server demonstrates how administrators and developers can both play their part in protecting a SQL Server environment. The book provides a comprehensive technical guide to the security model and to encryption within SQL Server, including coverage of the latest security technologies such as Always Encrypted, Dynamic Data Masking, and Row Level Security. Most importantly, the book gives practical advice and engaging examples on how to defend your data -- and ultimately your job! -- against attack and compromise. It covers the latest security technologies, including Always Encrypted, Dynamic Data Masking, and Row Level Security; promotes security best practices and strategies for defense-in-depth of business-critical database assets; and gives advice on performing threat analysis and reducing the attack surface that your database presents to the outside world.
What You Will Learn:
• Perform threat analysis
• Implement access-level control and data encryption
• Ensure non-repudiation by implementing comprehensive auditing
• Use security metadata to ensure your security policies are enforced
• Apply the latest SQL Server technologies to increase data security
• Mitigate the risk of credentials being stolen
Who This Book Is For: SQL Server database administrators who need to understand and counteract the threat of attacks against their company's data. The book is also of interest to database administrators of other platforms, as several of the attack techniques are easily generalized beyond SQL Server to other database brands.
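
As a hedged illustration of two of the technologies named above, the following minimal Transact-SQL sketch enables Dynamic Data Masking on sensitive columns and adds a Row Level Security filter; the schema, table, column, and predicate names are hypothetical, and a real deployment would layer these features with the auditing and encryption the book describes.

-- Hypothetical table with masked sensitive columns (SQL Server 2016 and later)
CREATE TABLE dbo.Customers (
    CustomerID   int IDENTITY(1,1) PRIMARY KEY,
    Email        nvarchar(256) MASKED WITH (FUNCTION = 'email()'),
    CreditCardNo varchar(19)   MASKED WITH (FUNCTION = 'partial(0, "XXXX-XXXX-XXXX-", 4)'),
    TenantID     int NOT NULL
);
GO

CREATE SCHEMA Security AUTHORIZATION dbo;
GO

-- Row Level Security: each session sees only the rows for its own TenantID
CREATE FUNCTION Security.fn_TenantPredicate (@TenantID int)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result
       WHERE @TenantID = CAST(SESSION_CONTEXT(N'TenantID') AS int);
GO

CREATE SECURITY POLICY Security.TenantFilter
    ADD FILTER PREDICATE Security.fn_TenantPredicate(TenantID) ON dbo.Customers
    WITH (STATE = ON);
GO

With this in place, a user who lacks the UNMASK permission sees obfuscated e-mail and card values, and every query against dbo.Customers is silently filtered by whatever TenantID the application placed in the session context at connection time.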

Securing Your Cloud: IBM z/VM Security for IBM z Systems and LinuxONE

As workloads are being offloaded to IBM® z Systems™ based cloud environments, it is important to ensure that these workloads and environments are secure. This IBM Redbooks® publication describes the necessary steps to secure your environment for all of the components that are involved in a z Systems cloud infrastructure that uses IBM z/VM® and Linux on z Systems. The audience for this book is IT architects and those planning to use z Systems for their cloud environments.

In this session, Dr. Nipa Basu, Chief Analytics Officer at Dun & Bradstreet, sat down with Vishal Kumar, CEO of AnalyticsWeek, and shared her journey as Chief Analytics Officer, life at D&B, the future of credit scoring, and some of the challenges and opportunities she is observing as an industry observer, executive, and practitioner.

Timeline:
0:29 Nipa's background.
4:14 What is D&B?
7:40 Depth and breadth of decision making at D&B.
9:36 Matching security with technological evolution.
13:42 Anticipatory analytics.
16:00 The CAO's role in D&B: in-facing or out-facing?
18:32 The future of credit scoring.
21:36 Challenges in dealing with clients.
24:08 Cultural challenges.
28:42 Good use cases in security data.
31:51 CDO, CAO, and CTO.
33:56 Optimistic trends in data analytics businesses.
36:44 Social data monitoring.
39:18 Creating a holistic model for data monitoring.
41:02 Overused terms in data analytics.
42:10 Best practices for small businesses getting started with data analytics.
44:33 Indicators that a business needs analytics.
47:06 Advice for data-driven leaders.
49:30 The art and the science of doing business.

Podcast link: https://futureofdata.org/analyticsweek-leadership-podcast-with-dr-nipa-basu-dun-bradstreet/

Here's Nipa's Bio: Dr. Nipa Basu is the Chief Analytics Officer at Dun & Bradstreet. Nipa is the main source of inspiration and leadership for Dun & Bradstreet’s extensive team of data modelers and scientists that partner with the world’s leading Fortune 500 companies to create innovative, analytic solutions to drive business growth and results. The team is highly skilled in solving a wide range of business challenges with unique, basic, and advanced analytic applications.

Nipa joined Dun & Bradstreet in 2000 and since then has held key leadership roles focused on driving the success of Dun & Bradstreet’s Analytics practice. In 2012, Nipa was named Leader, Analytic Development, and in March 2015, Nipa was named Chief Analytics Officer and appointed to Dun & Bradstreet’s executive team.

Nipa began her professional career as an Economist with the New York State Legislative Tax Study Commission. She then joined Sandia National Laboratories, a national defense laboratory, where she built a Microsimulation Model of the U.S. Economy. Prior to joining Dun & Bradstreet, Nipa was a database marketing statistician for AT&T with responsibility for building predictive marketing models.

Nipa received her Ph.D. in Economics from the State University of New York at Albany, specializing in Econometrics.

Follow @nipabasu

The podcast is sponsored by TAO.ai (https://tao.ai), an Artificial Intelligence Driven Career Coach.

About #Podcast:

The FutureOfData podcast is a conversation starter that brings together leaders, influencers, and leading practitioners to discuss their journeys toward creating a data-driven future.

Want to Join? If you or anyone you know wants to join in, register your interest @ http://play.analyticsweek.com/guest/

Want to sponsor? Email us @ [email protected]

Keywords:

#FutureOfData #DataAnalytics #Leadership #Podcast #BigData #Strategy

Microsoft SQL Server 2016: A Beginner's Guide, Sixth Edition

Up-to-date Microsoft SQL Server 2016 skills made easy! Get up and running on Microsoft SQL Server 2016 in no time with help from this thoroughly revised, practical resource. The book offers thorough coverage of SQL management and development and features full details on the newest business intelligence, reporting, and security features. Filled with new real-world examples and hands-on exercises, Microsoft SQL Server 2016: A Beginner's Guide, Sixth Edition starts by explaining fundamental relational database system concepts. From there, you will learn how to write Transact-SQL statements, execute simple and complex database queries, handle system administration and security, and use the powerful analysis and BI tools. XML, spatial data, and full-text search are also covered in this step-by-step tutorial. · Revised from the ground up to cover the latest version of SQL Server · Ideal both as a self-study guide and a classroom textbook · Written by a prominent professor and best-selling author
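
As a small, hedged taste of the Transact-SQL fundamentals such a beginner's guide walks through, the sketch below creates a table, inserts a few rows, and runs a grouped query; all object names and values are invented for illustration.

-- Illustrative only: a small table, sample rows, and an aggregate query
CREATE TABLE dbo.Orders (
    OrderID    int IDENTITY(1,1) PRIMARY KEY,
    CustomerID int           NOT NULL,
    OrderDate  date          NOT NULL,
    Amount     decimal(10,2) NOT NULL
);

INSERT INTO dbo.Orders (CustomerID, OrderDate, Amount)
VALUES (1, '2016-06-01', 125.00),
       (1, '2016-06-15',  80.50),
       (2, '2016-06-20', 240.00);

-- Total spend per customer, largest first
SELECT CustomerID,
       SUM(Amount) AS TotalAmount
FROM   dbo.Orders
GROUP  BY CustomerID
ORDER  BY TotalAmount DESC;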

Oracle Database 12c Release 2 Multitenant

Master the Powerful Multitenant Features of Oracle Database 12c. Govern a scalable, extensible, and highly available enterprise database environment using the practical information contained in this Oracle Press guide. Written by a team of Oracle Masters, Oracle Database 12c Release 2 Multitenant shows, step-by-step, how to deploy and manage multitenant configurations across IT frameworks of all types and sizes. Find out how to create databases, work with PDBs and CDBs, administer Oracle Net Services, and automate administrative tasks. Backup and recovery, security, and advanced multitenant options are covered in complete detail. Learn how to: • Build high-performance multitenant Oracle databases • Create single-tenant, multitenant, and application containers • Establish network connections and manage services • Handle security using authentication, authorization, and encryption • Back up and restore your mission-critical data • Work with point-in-time recovery and Oracle Flashback • Move data and replicate and clone databases • Work with Oracle’s Resource Manager and Data Guard
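
To give a rough, hedged sense of the multitenant workflow the book covers (creating a pluggable database inside a container database, opening it, and switching the session into it), here is a minimal sketch with invented names and file paths; a real environment would follow the book's guidance on storage layout, services, and backups.

-- Illustrative only: run as a common user in the CDB root with CREATE PLUGGABLE DATABASE privilege
CREATE PLUGGABLE DATABASE sales_pdb
  ADMIN USER pdb_admin IDENTIFIED BY "ChangeMe#1"
  FILE_NAME_CONVERT = ('/u01/oradata/cdb1/pdbseed/', '/u01/oradata/cdb1/sales_pdb/');

ALTER PLUGGABLE DATABASE sales_pdb OPEN;

-- Switch the current session into the new PDB
ALTER SESSION SET CONTAINER = sales_pdb;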

IBM PowerVC Version 1.3.1 Introduction and Configuration Including IBM Cloud PowerVC Manager

IBM® Power Virtualization Center (IBM® PowerVC™) is an advanced, enterprise virtualization management offering for IBM Power Systems™. This IBM Redbooks® publication introduces PowerVC and helps you understand its functions, planning, installation, and setup. PowerVC Version 1.3.1 supports both large and small deployments, either by managing IBM PowerVM® that is controlled by the Hardware Management Console (HMC) or by IBM PowerVM NovaLink, or by managing PowerKVM directly. With this capability, PowerVC can manage IBM AIX®, IBM i, and Linux workloads that run on IBM POWER® hardware, including IBM PurePower systems. PowerVC is available as a Standard Edition or as a Cloud PowerVC Manager edition.
PowerVC Standard Edition includes the following features and benefits:
• Virtual image capture, deployment, and management
• Policy-based virtual machine (VM) placement to improve utilization
• Management of real-time optimization and VM resilience to increase productivity
• VM mobility with placement policies to reduce the burden on IT staff, in a simple-to-install and easy-to-use graphical user interface (GUI)
• Role-based security policies to ensure a secure environment for common tasks
IBM Cloud PowerVC Manager includes all of the PowerVC Standard Edition features and adds:
• A self-service portal that enables user access to the cloud infrastructure on a per-project basis
• The ability for an administrator to enable Dynamic Resource Optimization on a schedule
This publication is for experienced users of IBM PowerVM and other virtualization solutions who want to understand and implement the next generation of enterprise virtualization management for Power Systems. Unless stated otherwise, the content of this publication refers to IBM PowerVC Version 1.3.1.

The Global Impact of Open Data

Open data has spurred economic innovation, social transformation, and fresh forms of political and government accountability in recent years, but few people understand how open data works. This comprehensive report, developed with support from Omidyar Network, presents detailed case studies of open data projects throughout the world, along with in-depth analysis of what works and what doesn’t. Authors Andrew Young and Stefaan Verhulst, both with The GovLab at New York University, explain how these projects have made governments more accountable and efficient, helped policymakers find solutions to previously intractable public problems, created new economic opportunities, and empowered citizens through new forms of social mobilization. This report includes:
• Recommendations and implementation steps for policymakers, entrepreneurs, and activists looking to leverage open data
• Key challenges, such as resource shortages and inadequate privacy or security protections
• Four conditions that enable open data to work, including organizational partnerships and collaborations
• Case studies of open data projects for improving government in Brazil, Sweden, Slovakia, and other countries
• Projects for empowering citizens in Tanzania, Kenya, Mexico, and Uruguay
• New business opportunities enabled by open weather, geo-location, and market research data
• Public problem-solving efforts built on open data for Ebola in Sierra Leone, dengue fever in Singapore, and earthquakes in New Zealand
Andrew Young (@_AndrewYoung) is the Associate Director of Research at The GovLab (www.thegovlab.org), where he leads a number of grant-funded research efforts focusing on the impact of technology on public institutions. He is also the Network Coordinator of the GovLab-chaired MacArthur Foundation Research Network on Opening Governance. Stefaan G. Verhulst (@sverhulst) is the Co-Founder and Chief R&D Officer of The GovLab at New York University’s Tandon School of Engineering, responsible for experimentation and evidence gathering on how to transform governance by using advances in science and technology. He was Chief of Research for the Markle Foundation, where he continues to serve as Senior Advisor.

Mobile Security and Privacy

Mobile Security and Privacy: Advances, Challenges and Future Research Directions provides the first truly holistic view of leading-edge mobile security research from Dr. Man Ho Au and Dr. Raymond Choo, leading researchers in mobile security. Mobile devices and apps have become part of everyday life in both developed and developing countries. As with most evolving technologies, mobile devices and mobile apps can be used for criminal exploitation. Along with the increased use of mobile devices and apps to access and store sensitive, personally identifiable information (PII) has come an increasing need for the community to have a better understanding of the associated security and privacy risks. Drawing upon the expertise of world-renowned researchers and experts, this volume comprehensively discusses a range of mobile security and privacy topics from research, applied, and international perspectives, while aligning technical security implementations with the most recent developments in government, legal, and international environments. The book does not focus on vendor-specific solutions, instead providing a complete presentation of forward-looking research in all areas of mobile security. The book will enable practitioners to learn about upcoming trends, scientists to share new directions in research, and government and industry decision-makers to prepare for major strategic decisions regarding implementation of mobile technology security and privacy. In addition to the state-of-the-art research advances, this book also discusses prospective future research topics and open challenges. The book:
• Presents the most current and leading-edge research on mobile security and privacy, featuring a panel of top experts in the field
• Provides a strategic and international overview of the security issues surrounding mobile technologies
• Covers key technical topics and provides readers with a complete understanding of the most current research findings along with future research directions and challenges
• Enables practitioners to learn about upcoming trends, scientists to share new directions in research, and government and industry decision-makers to prepare for major strategic decisions regarding the implementation of mobile technology security and privacy initiatives

Data Hiding Techniques in Windows OS

"This unique book delves down into the capabilities of hiding and obscuring data object within the Windows Operating System. However, one of the most noticeable and credible features of this publication is, it takes the reader from the very basics and background of data hiding techniques, and run’s on the reading-road to arrive at some of the more complex methodologies employed for concealing data object from the human eye and/or the investigation. As a practitioner in the Digital Age, I can see this book siting on the shelves of Cyber Security Professionals, and those working in the world of Digital Forensics – it is a recommended read, and is in my opinion a very valuable asset to those who are interested in the landscape of unknown unknowns. This is a book which may well help to discover more about that which is not in immediate view of the onlooker, and open up the mind to expand its imagination beyond its accepted limitations of known knowns." - John Walker, CSIRT/SOC/Cyber Threat Intelligence Specialist Featured in Digital Forensics Magazine, February 2017 In the digital world, the need to protect online communications increase as the technology behind it evolves. There are many techniques currently available to encrypt and secure our communication channels. Data hiding techniques can take data confidentiality to a new level as we can hide our secret messages in ordinary, honest-looking data files. Steganography is the science of hiding data. It has several categorizations, and each type has its own techniques in hiding. Steganography has played a vital role in secret communication during wars since the dawn of history. In recent days, few computer users successfully manage to exploit their Windows® machine to conceal their private data. Businesses also have deep concerns about misusing data hiding techniques. Many employers are amazed at how easily their valuable information can get out of their company walls. In many legal cases a disgruntled employee would successfully steal company private data despite all security measures implemented using simple digital hiding techniques. Human right activists who live in countries controlled by oppressive regimes need ways to smuggle their online communications without attracting surveillance monitoring systems, continuously scan in/out internet traffic for interesting keywords and other artifacts. The same applies to journalists and whistleblowers all over the world. Computer forensic investigators, law enforcements officers, intelligence services and IT security professionals need a guide to tell them where criminals can conceal their data in Windows® OS & multimedia files and how they can discover concealed data quickly and retrieve it in a forensic way. Data Hiding Techniques in Windows OS is a response to all these concerns. Data hiding topics are usually approached in most books using an academic method, with long math equations about how each hiding technique algorithm works behind the scene, and are usually targeted at people who work in the academic arenas. This book teaches professionals and end users alike how they can hide their data and discover the hidden ones using a variety of ways under the most commonly used operating system on earth, Windows®.

SAS Data Analytic Development

Design quality SAS software and evaluate SAS software quality SAS Data Analytic Development is the developer’s compendium for writing better-performing software and the manager’s guide to building comprehensive software performance requirements. The text introduces and parallels the International Organization for Standardization (ISO) software product quality model, demonstrating 15 performance requirements that represent dimensions of software quality, including: reliability, recoverability, robustness, execution efficiency (i.e., speed), efficiency, scalability, portability, security, automation, maintainability, modularity, readability, testability, stability, and reusability. The text is intended to be read cover-to-cover or used as a reference tool to instruct, inspire, deliver, and evaluate software quality. A common fault in many software development environments is a focus on functional requirements—the what and how—to the detriment of performance requirements, which specify instead how well software should function (assessed through software execution) or how easily software should be maintained (assessed through code inspection). Without the definition and communication of performance requirements, developers risk either building software that lacks intended quality or wasting time delivering software that exceeds performance objectives—thus, either underperforming or gold-plating, both of which are undesirable. Managers, customers, and other decision makers should also understand the dimensions of software quality both to define performance requirements at project outset as well as to evaluate whether those objectives were met at software completion. As data analytic software, SAS transforms data into information and ultimately knowledge and data-driven decisions. Not surprisingly, data quality is a central focus and theme of SAS literature; however, code quality is far less commonly described and too often references only the speed or efficiency with which software should execute, omitting other critical dimensions of software quality. SAS® software project definitions and technical requirements often fall victim to this paradox, in which rigorous quality requirements exist for data and data products yet not for the software that undergirds them. By demonstrating the cost and benefits of software quality inclusion and the risk of software quality exclusion, stakeholders learn to value, prioritize, implement, and evaluate dimensions of software quality within risk management and project management frameworks of the software development life cycle (SDLC). Thus, SAS Data Analytic Development recalibrates business value, placing code quality on par with data quality, and performance requirements on par with functional requirements.

IBM Data Engine for Hadoop and Spark

This IBM® Redbooks® publication provides topics to help the technical community take advantage of the resilience, scalability, and performance of the IBM Power Systems™ platform to implement or integrate an IBM Data Engine for Hadoop and Spark solution for analytics solutions to access, manage, and analyze data sets to improve business outcomes. This book documents topics to demonstrate and take advantage of the analytics strengths of the IBM POWER8® platform, the IBM analytics software portfolio, and selected third-party tools to help solve customers' data analytic workload requirements. This book describes how to plan, prepare, install, integrate, manage, and use the IBM Data Engine for Hadoop and Spark solution to run analytic workloads on IBM POWER8. In addition, this publication delivers documentation to complement available IBM analytics solutions to help meet your data analytic needs. This publication strengthens the position of IBM analytics and big data solutions with a well-defined and documented deployment model within an IBM POWER8 virtualized environment, so that customers have a planned foundation for security, scaling, capacity, resilience, and optimization for analytics workloads. This book is targeted at technical professionals (analytics consultants, technical support staff, IT Architects, and IT Specialists) who are responsible for delivering analytics solutions and support on IBM Power Systems.

Real World SQL and PL/SQL: Advice from the Experts

Master the Underutilized Advanced Features of SQL and PL/SQL. This hands-on guide from Oracle Press shows how to fully exploit lesser known but extremely useful SQL and PL/SQL features, and how to effectively use both languages together. Written by a team of Oracle ACE Directors, Real-World SQL and PL/SQL: Advice from the Experts features best practices, detailed examples, and insider tips that clearly demonstrate how to write, troubleshoot, and implement code for a wide variety of practical applications. The book thoroughly explains underutilized SQL and PL/SQL functions and lays out essential development strategies. Data modeling, advanced analytics, database security, secure coding, and administration are covered in complete detail. Learn how to: • Apply advanced SQL and PL/SQL tools and techniques • Understand SQL and PL/SQL functionality and determine when to use which language • Develop accurate data models and implement business logic • Run PL/SQL in SQL and integrate complex datasets • Handle PL/SQL instrumenting and profiling • Use Oracle Advanced Analytics and Oracle R Enterprise • Build and execute predictive queries • Secure your data using encryption, hashing, redaction, and masking • Defend against SQL injection and other code-based attacks • Work with Oracle Virtual Private Database Code examples in the book are available for download at www.MHProfessional.com. For a complete list of Oracle Press titles, visit www.OraclePressBooks.com.
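
One of the defensive techniques listed above, guarding dynamic SQL against injection, can be sketched in a few lines of PL/SQL; the procedure and table names are hypothetical, and the point is simply that user input travels as a bind variable rather than being concatenated into the statement text.

-- Illustrative only: dynamic SQL with a bind variable instead of string concatenation
CREATE OR REPLACE PROCEDURE count_employees (p_last_name IN VARCHAR2) AS
  v_count NUMBER;
BEGIN
  -- The unsafe form would concatenate p_last_name directly into the statement
  EXECUTE IMMEDIATE
    'SELECT COUNT(*) FROM employees WHERE last_name = :name'
    INTO v_count
    USING p_last_name;
  DBMS_OUTPUT.PUT_LINE('Matching rows: ' || v_count);
END;
/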

IBM TS4500 R3 Tape Library Guide

The IBM® TS4500 tape library is a next-generation tape solution that offers higher storage density and more integrated management than previous solutions. This IBM Redbooks® publication gives you a close-up view of the new IBM TS4500 tape library. In the TS4500, IBM delivers the density that today's and tomorrow's data growth requires, with the cost-effectiveness and manageability to grow with business data needs while preserving existing investments in IBM tape library products. You can now achieve both a low cost per terabyte (TB) and a high TB density per square foot, because the TS4500 can store up to 5.5 petabytes (PB) of data in a single 10-square-foot library frame, which is up to 3.4 times more capacity than the IBM TS3500 tape library. The TS4500 offers these benefits:
• High availability: Dual active accessors with integrated service bays reduce inactive service space by 40%. The Elastic Capacity option can be used to completely eliminate inactive service space.
• Flexibility to grow: The TS4500 library can grow from both the right side and the left side of the first L frame because models can be placed in any active position.
• Increased capacity: The TS4500 can grow from a single L frame up to an additional 17 expansion frames with a capacity of over 23,000 cartridges. High-density (HD) generation 1 frames from the existing TS3500 library can be redeployed in a TS4500.
• Capacity on demand (CoD): CoD is supported through entry-level, intermediate, and base-capacity configurations.
• Advanced Library Management System (ALMS): ALMS supports dynamic storage management, which enables users to create and change logical libraries and configure any drive for any logical library.
• Support for the IBM TS1150 tape drive: The TS1150 gives organizations an easy way to deliver fast access to data, improve security, and provide long-term retention, all at a lower cost than disk solutions. The TS1150 offers high-performance, flexible data storage with support for data encryption. Also, this fifth-generation drive can help protect investments in tape automation by offering compatibility with existing automation.
• Support for the IBM Linear Tape-Open (LTO) Ultrium 7 tape drive: The LTO Ultrium 7 offering represents significant improvements in capacity, performance, and reliability over the previous generation, LTO Ultrium 6, while still protecting your investment in the previous technology.
• Integrated TS7700 back-end Fibre Channel (FC) switches are available.
• Up to four library-managed encryption (LME) key paths per logical library are available.
This book describes the TS4500 components, feature codes, specifications, supported tape drives, encryption, new integrated management console (IMC), and command-line interface (CLI). You learn how to accomplish several specific tasks:
• Improve storage density with increased expansion frame capacity up to 2.4 times and support for 33% more tape drives per frame.
• Manage storage by using the ALMS feature.
• Improve business continuity and disaster recovery with dual active accessors, automatic control path failover, and data path failover.
• Help ensure security and regulatory compliance with tape-drive encryption and Write Once Read Many (WORM) media.
• Support IBM LTO Ultrium 7, 6, and 5, and IBM TS1150 and TS1140 tape drives.
• Provide a flexible upgrade path for users who want to expand their tape storage as their needs grow.
• Reduce the storage footprint and simplify cabling with 10 U of rack space on top of the library.
This guide is for anyone who wants to understand more about the IBM TS4500 tape library. It is particularly suitable for IBM clients, IBM Business Partners, IBM specialist sales representatives, and technical specialists.

In Search of Database Nirvana

The database pendulum is in full swing. Ten years ago, web-scale companies began moving away from proprietary relational databases to handle big data use cases with NoSQL and Hadoop. Now, for a variety of reasons, the pendulum is swinging back toward SQL-based solutions. What many companies really want is a system that can handle all of their operational, OLTP, BI, and analytic workloads. Could such an all-in-one database exist? This O’Reilly report examines this quest for database nirvana, or what Gartner recently dubbed Hybrid Transaction/Analytical Processing (HTAP). Author Rohit Jain takes an in-depth look at the possibilities and the challenges for companies that long for a single query engine to rule them all. With this report, you’ll explore:
• The challenges of having one query engine support operational, BI, and analytical workloads
• Efforts to produce a query engine that supports multiple storage engines
• Attempts to support multiple data models with the same query engine
• Why an HTAP database engine needs to provide enterprise-caliber capabilities, including high availability, security, and manageability
• How to assess various options for meeting workload requirements with one database engine, or a combination of query and storage engines

Practical Hadoop Migration: How to Integrate Your RDBMS with the Hadoop Ecosystem and Re-Architect Relational Applications to NoSQL

Re-architect relational applications to NoSQL, integrate relational database management systems with the Hadoop ecosystem, and transform and migrate relational data to and from Hadoop components. This book covers the best-practice design approaches to re-architecting your relational applications and transforming your relational data to optimize concurrency, security, denormalization, and performance. Winner of IBM's 2012 Gerstner Award for his implementation of big data and data warehouse initiatives and author of Practical Hadoop Security, Bhushan Lakhe walks you through the entire transition process. First, he lays out the criteria for deciding what blend of re-architecting, migration, and integration between RDBMS and HDFS best meets your transition objectives. Then he demonstrates how to design your transition model. Lakhe proceeds to cover the selection criteria for ETL tools, the implementation steps for migration with Sqoop- and Flume-based data transfers, and transition optimization techniques for tuning partitions, scheduling aggregations, and redesigning ETL. Finally, he assesses the pros and cons of data lakes and Lambda architecture as integrative solutions and illustrates their implementation with real-world case studies. Hadoop/NoSQL solutions do not offer by default certain relational technology features, such as role-based access control, locking for concurrent updates, and various tools for measuring and enhancing performance. Practical Hadoop Migration shows how to use open-source tools to emulate such relational functionalities in Hadoop ecosystem components.
What You'll Learn:
• Decide whether you should migrate your relational applications to big data technologies or integrate them
• Transition your relational applications to Hadoop/NoSQL platforms in terms of logical design and physical implementation
• Discover RDBMS-to-HDFS integration, data transformation, and optimization techniques
• Consider when to use Lambda architecture and data lake solutions
• Select and implement Hadoop-based components and applications to speed transition, optimize integrated performance, and emulate relational functionalities
Who This Book Is For: Database developers, database administrators, enterprise architects, Hadoop/NoSQL developers, and IT leaders. Its secondary readership is project and program managers and advanced students of database and management information systems.
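
As a hedged sketch of one integration step of the kind this book covers, the HiveQL below exposes delimited files that have landed in HDFS (for example, via a Sqoop import) to SQL-style queries; the table name, columns, and path are invented for illustration.

-- Illustrative only: external Hive table over files staged in HDFS
CREATE EXTERNAL TABLE customers_staging (
  customer_id   INT,
  customer_name STRING,
  email         STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/staging/customers';

-- Relational-style query over the imported data
SELECT COUNT(*) FROM customers_staging;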

Enabling Real-time Analytics on IBM z Systems Platform

For online transaction processing (OLTP) workloads, the IBM® z Systems™ platform, with IBM DB2®, data sharing, Workload Manager (WLM), geoplex, and other high-end features, is the widely acknowledged leader. Most customers now integrate business analytics with OLTP, for example by running scoring functions from a transactional context for real-time analytics, or by applying machine-learning algorithms to enterprise data that is kept on the mainframe. As a result, IBM is adding investment so that clients can keep the complete lifecycle for data analysis, modeling, and scoring under z Systems control in a cost-efficient way, while retaining the qualities of service in availability, security, and reliability that z Systems solutions offer. Because of the changed architecture and tighter integration, IBM has shown in a customer proof-of-concept that a particular client was able to achieve an orders-of-magnitude improvement in performance, allowing that client’s data scientists to investigate the data in a more interactive process. Open technologies, such as Predictive Model Markup Language (PMML), can help customers update single components instead of being forced to replace everything at once. As a result, you can combine your preferred tool for model generation (such as SAS Enterprise Miner or IBM SPSS® Modeler) with a different technology for model scoring (such as Zementis, a company focused on PMML scoring). IBM SPSS Modeler is a leading data mining workbench that can apply various algorithms in data preparation, cleansing, statistics, visualization, machine learning, and predictive analytics. It reflects over 20 years of experience and continued development, and is integrated with z Systems. With IBM DB2 Analytics Accelerator 5.1 and SPSS Modeler 17.1, it is possible to do complete predictive model creation, including data transformation, within DB2 Analytics Accelerator. So, instead of moving the data to a distributed environment, algorithms can be pushed to the data, using the cost-efficient DB2 Analytics Accelerator for the required resource-intensive operations. This IBM Redbooks® publication explains the overall z Systems architecture, how the components can be installed and customized, how the new IBM DB2 Analytics Accelerator loader can help with efficient data loading for z Systems data and external data, how in-database transformation, in-database modeling, and in-transaction real-time scoring can be used, and what other related technologies are available. This book is intended for technical specialists, architects, and data scientists who want to use the technology on the z Systems platform. Most of the technologies described in this book require IBM DB2 for z/OS®. For acceleration of the data investigation, data transformation, and data modeling process, DB2 Analytics Accelerator is required. Most value can be achieved if most of the data already resides on z Systems platforms, although adding external data (such as from social sources) poses no problem at all.

IBM Netcool Operations Insight Version 1.4: Deployment Guide

IBM® Netcool® Operations Insight integrates infrastructure and operations management into a single coherent structure across business applications, virtualized servers, network devices and protocols, internet protocols, and security and storage devices. This IBM Redbooks® publication will help you install, tailor, and configure Netcool Operations Insight Version 1.4. Netcool Operations Insight consists of several products and components that can be installed on many servers in many combinations, so you must make many decisions, some critical and some a matter of personal preference. The purpose of this document is to accelerate the initial deployment of Netcool Operations Insight by making preferred-practice choices. The target audience of this book is Netcool Operations Insight deployment specialists.

Perspectives on Data Science for Software Engineering

Perspectives on Data Science for Software Engineering presents the best practices of seasoned data miners in software engineering. The idea for this book arose during the 2014 conference at Dagstuhl, an invitation-only gathering of leading computer scientists who meet to identify and discuss cutting-edge informatics topics. At the 2014 conference, the question of how to transfer knowledge from seasoned software engineers and data scientists to newcomers in the field dominated many discussions. While there are many books covering data mining and software engineering basics, they present only the fundamentals and lack the perspective that comes from real-world experience. This book offers unique insights into the wisdom of the community’s leaders, gathered to share hard-won lessons from the trenches. Ideas are presented in digestible chapters designed to be applicable across many domains. Topics covered include data collection, data sharing, data mining, and how to use these techniques in successful software projects. Newcomers to software engineering data science will learn the tips and tricks of the trade, while more experienced data scientists will benefit from war stories that show what traps to avoid. The book:
• Presents the wisdom of community experts, derived from a summit on software analytics
• Provides contributed chapters that share discrete ideas and techniques from the trenches
• Covers top areas of concern, including mining security and social data, data visualization, and cloud-based data
• Is presented in clear chapters designed to be applicable across many domains

Global Dynamics

A world model: economies, trade, migration, security, and development aid. This book provides the analytical capability to understand and explore the dynamics of globalisation. It is anchored in economic input-output models of over 200 countries and their relationships through trade, migration, security, and development aid. The tools of complexity science are brought to bear, and mathematical and computer models are developed both for the elements and for an integrated whole. Models are developed at a variety of scales, ranging from global and international trade, through a European model of inter-sub-regional migration, to piracy in the Gulf and the London riots of 2011. The models embrace the changing technology of international shipping and the impacts of migration on economic development, along with changing patterns of military expenditure and development aid. A unique contribution is the level of spatial disaggregation, which represents each of 200+ countries and their mutual interdependencies, along with some finer-scale analyses of cities and regions. This is the first global model that offers this depth of detail with fully worked-out models; these provide tools for policy making at national, European, and global scales. Global Dynamics:
• Presents in-depth models of global dynamics.
• Provides a world economic model of 200+ countries and their interactions through trade, migration, security, and development aid.
• Provides pointers to the deployment of analytical capability through modelling in policy development.
• Features a variety of models that constitute a formidable toolkit for analysis and policy development.
• Offers a demonstration of the practicalities of complexity science concepts.
This book is for practitioners and policy analysts, as well as those interested in mathematical model building and complexity science, and advanced undergraduate and postgraduate students.
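
For readers new to the input-output framework the book builds on, the core accounting identity can be written as a standard Leontief relation (a textbook formulation, not taken from the book itself): with x the vector of sectoral outputs, A the matrix of technical coefficients, and d final demand,

\[ x = Ax + d \quad\Longrightarrow\quad x = (I - A)^{-1} d, \]

so total output is final demand propagated through the inter-sector (and, in the global model, inter-country) linkages.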

Introducing Microsoft SQL Server 2016: Mission-Critical Applications, Deeper Insights, Hyperscale Cloud

With Microsoft SQL Server 2016, a variety of new features and enhancements to the data platform deliver breakthrough performance, advanced security, and richer, integrated reporting and analytics capabilities. In this ebook, we introduce new security features: Always Encrypted, Row-Level Security, and dynamic data masking; discuss enhancements that enable you to better manage performance and storage: TempDB configuration, Query Store, and Stretch Database; review several improvements to Reporting Services; and also describe AlwaysOn Availability Groups, tabular enhancements, and R integration.
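
As a small, hedged illustration of two of the features mentioned (the database, table, and column names are invented, not taken from the ebook), Query Store is enabled per database, and dynamic data masking can be added to an existing column:

-- Illustrative only: turn on Query Store for a database (SQL Server 2016)
ALTER DATABASE SalesDb SET QUERY_STORE = ON;
ALTER DATABASE SalesDb SET QUERY_STORE (OPERATION_MODE = READ_WRITE, QUERY_CAPTURE_MODE = AUTO);

-- Illustrative only: mask an existing column for non-privileged users
ALTER TABLE dbo.Customers
  ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'default()');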