talk-data.com

Topic: storage-repositories (100 tagged)

Activity Trend: 1 peak/qtr, 2020-Q1 to 2026-Q1

Activities: 100 activities · Newest first

Leveraging DB2 10 for High Performance of Your Data Warehouse

Building on the business intelligence (BI) framework and capabilities that are outlined in InfoSphere Warehouse: A Robust Infrastructure for Business Intelligence, SG24-7813, this IBM® Redbooks® publication focuses on the new business insight challenges that have arisen in the last few years and the new technologies in IBM DB2® 10 for Linux, UNIX, and Windows that provide powerful analytic capabilities to meet those challenges. This book is organized into two parts. The first part provides an overview of data warehouse infrastructure and DB2 Warehouse, and outlines the planning and design process for building your data warehouse. The second part covers the major technologies that are available in DB2 10 for Linux, UNIX, and Windows. We focus on functions that help you get the most value and performance from your data warehouse. These technologies include database partitioning, intrapartition parallelism, compression, multidimensional clustering, range (table) partitioning, data movement utilities, database monitoring interfaces, infrastructures for high availability, DB2 workload management, data mining, and relational OLAP capabilities. A chapter on BLU Acceleration gives you all of the details about this exciting DB2 10.5 innovation that simplifies and speeds up reporting and analytics. Easy to set up and self-optimizing, BLU Acceleration eliminates the need for indexes, aggregates, or time-consuming database tuning to achieve top performance and storage efficiency. No SQL or schema changes are required to take advantage of this breakthrough technology. This book is primarily intended for use by IBM employees, IBM clients, and IBM Business Partners.
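
The BLU Acceleration point lends itself to a quick illustration. The sketch below is not from the book: it assumes the ibm_db Python driver, a reachable DB2 10.5 instance, and placeholder connection details, and simply creates a column-organized table of the kind BLU Acceleration operates on.

```python
# Minimal sketch: creating a column-organized table for BLU Acceleration.
# Assumes DB2 10.5+, the ibm_db driver, and hypothetical connection details.
import ibm_db

conn = ibm_db.connect(
    "DATABASE=warehouse;HOSTNAME=db2host;PORT=50000;PROTOCOL=TCPIP;"
    "UID=dbuser;PWD=secret;",  # placeholder credentials
    "", ""
)

# ORGANIZE BY COLUMN stores the table column-wise, which is what allows
# BLU Acceleration to work without secondary indexes or pre-built aggregates.
ibm_db.exec_immediate(conn, """
    CREATE TABLE sales_fact (
        sale_date   DATE,
        store_id    INTEGER,
        product_id  INTEGER,
        amount      DECIMAL(12,2)
    ) ORGANIZE BY COLUMN
""")

ibm_db.close(conn)
```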

The Definitive Guide to Warehousing: Managing the Storage and Handling of Materials and Products in the Supply Chain

This is the most authoritative and complete guide to planning, implementing, measuring, and optimizing world-class supply chain warehousing processes. Straight from the Council of Supply Chain Management Professionals (CSCMP), it explains each warehousing option, basic warehousing storage and handling operations, strategic planning, and the effects of warehousing design and service decisions on total logistics costs and customer service. This reference introduces crucial concepts including product handling, labor management, warehouse support, and extended value chain processes; facility ownership, planning, and strategy decisions; materials handling; warehouse management systems; Auto-ID, AGVs, and much more. Step by step, The Definitive Guide to Warehousing helps you optimize all facets of warehousing, one of the most pivotal areas of supply chain management. Coverage includes:
- Basic warehousing management concepts and their essential role in demand fulfillment
- Key elements, processes, and interactions in warehousing operations management
- Principles and strategies for effectively planning and managing warehouse operations
- Principles and strategies for designing materials handling operations in warehousing facilities
- Critical roles of technology in managing warehouse operations and product flows
- Best practices for assessing the performance of warehousing operations using standard metrics and frameworks

Cloud Storage Forensics

To reduce the risk of digital forensic evidence being called into question in judicial proceedings, it is important to have a rigorous methodology and set of procedures for conducting digital forensic investigations and examinations. Digital forensic investigation in the cloud computing environment, however, is in its infancy due to the comparatively recent prevalence of cloud computing. Cloud Storage Forensics presents the first evidence-based cloud forensic framework. Using three popular cloud storage services and one private cloud storage service as case studies, the authors show you how their framework can be used to undertake research into the data remnants on both cloud storage servers and client devices when a user undertakes a variety of methods to store, upload, and access data in the cloud. By determining the data remnants on client devices, you gain a better understanding of the types of terrestrial artifacts that are likely to remain at the Identification stage of an investigation. Once it is determined that a cloud storage service account has potential evidence of relevance to an investigation, you can communicate this to legal liaison points within service providers to enable them to respond and secure evidence in a timely manner.
- Learn to use the methodology and tools from the first evidence-based cloud forensic framework
- Case studies provide detailed tools for analysis of cloud storage devices using popular cloud storage services
- Includes coverage of the legal implications of cloud storage forensic investigations
- Discussion of the future evolution of cloud storage and its impact on digital forensics
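
As a rough illustration of the "data remnants on client devices" idea, the sketch below is not taken from the book's framework: it walks a hypothetical cloud-client cache directory on a mounted evidence image and records file sizes and SHA-256 hashes, the kind of manifest an examiner might assemble at the Identification stage.

```python
# Illustrative only: catalogue files left behind by a cloud storage client.
# The evidence path is hypothetical.
import csv
import hashlib
import os

CACHE_DIR = "/evidence/image/Users/alice/AppData/Local/CloudClient"  # hypothetical

def sha256_of(path, chunk_size=1 << 20):
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

with open("remnant_manifest.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["path", "size_bytes", "sha256"])
    for root, _dirs, files in os.walk(CACHE_DIR):
        for name in files:
            full = os.path.join(root, name)
            writer.writerow([full, os.path.getsize(full), sha256_of(full)])
```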

Database Cloud Storage

Implement a Centralized Cloud Storage Infrastructure with Oracle Automatic Storage Management. Build and manage a scalable, highly available cloud storage solution. Filled with detailed examples and best practices, this Oracle Press guide explains how to set up a complete cloud-based storage system using Oracle Automatic Storage Management. Find out how to prepare hardware, build disk groups, efficiently allocate storage space, and handle security. Database Cloud Storage: The Essential Guide to Oracle Automatic Storage Management shows how to monitor your system, maximize throughput, and ensure consistency across servers and clusters.
- Set up and configure Oracle Automatic Storage Management
- Discover and manage disks and establish disk groups
- Create, clone, and administer Oracle databases
- Consolidate resources with Oracle Private Database Cloud
- Control access, encrypt files, and assign user privileges
- Integrate replication, file tagging, and automatic failover
- Employ pre-engineered private cloud database consolidation tools
- Check for data consistency and resync failed disks
Code examples in the book are available for download.

The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling, 3rd Edition

Updated new edition of Ralph Kimball's groundbreaking book on dimensional modeling for data warehousing and business intelligence! The first edition of Ralph Kimball's The Data Warehouse Toolkit introduced the industry to dimensional modeling, and now his books are considered the most authoritative guides in this space. This new third edition is a complete library of updated dimensional modeling techniques, the most comprehensive collection ever. It covers new and enhanced star schema dimensional modeling patterns, adds two new chapters on ETL techniques, includes new and expanded business matrices for 12 case studies, and more.
- Authored by Ralph Kimball and Margy Ross, known worldwide as educators, consultants, and influential thought leaders in data warehousing and business intelligence
- Begins with fundamental design recommendations and progresses through increasingly complex scenarios
- Presents unique modeling techniques for business applications such as inventory management, procurement, invoicing, accounting, customer relationship management, big data analytics, and more
- Draws real-world case studies from a variety of industries, including retail sales, financial services, telecommunications, education, health care, insurance, e-commerce, and more
Design dimensional databases that are easy to understand and provide fast query response with The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling, 3rd Edition.
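
Dimensional modeling is easy to sketch in code. The example below, which uses an in-memory SQLite database and invented table and column names, builds a tiny star schema of the general kind the book teaches (one fact table keyed to date and product dimensions); it is an illustration, not an excerpt from the book.

```python
# Toy star schema: one fact table surrounded by dimension tables.
# Table and column names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key     INTEGER PRIMARY KEY,
        full_date    TEXT,
        month        TEXT,
        year         INTEGER
    );
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        sku          TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        date_key     INTEGER REFERENCES dim_date(date_key),
        product_key  INTEGER REFERENCES dim_product(product_key),
        quantity     INTEGER,
        sales_amount REAL
    );
""")

conn.execute("INSERT INTO dim_date VALUES (20240105, '2024-01-05', 'January', 2024)")
conn.execute("INSERT INTO dim_product VALUES (1, 'SKU-001', 'Beverages')")
conn.execute("INSERT INTO fact_sales VALUES (20240105, 1, 3, 11.97)")

# A typical dimensional query: slice the facts by dimension attributes.
for row in conn.execute("""
    SELECT d.year, p.category, SUM(f.sales_amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
"""):
    print(row)
```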

Big Data Imperatives: Enterprise 'Big Data' Warehouse, 'BI' Implementations and Analytics

Big Data Imperatives focuses on resolving the key questions on everyone's mind: Which data matters? Do you have enough data volume to justify the usage? How do you want to process this amount of data? How long do you really need to keep it active for your analysis, marketing, and BI applications? Big data is emerging from the realm of one-off projects to mainstream business adoption; however, the real value of big data is not in its overwhelming size, but in its effective use. This book addresses the following big data characteristics:
- Very large, distributed aggregations of loosely structured data, often incomplete and inaccessible
- Petabytes/Exabytes of data
- Millions/billions of people providing/contributing to the context behind the data
- Flat schemas with few complex interrelationships
- Involves time-stamped events
- Made up of incomplete data
- Includes connections between data elements that must be probabilistically inferred
Big Data Imperatives explains what big data can do: it can batch process millions and billions of records, both unstructured and structured, much faster and cheaper. Big data analytics provide a platform to merge all analysis, which enables data analysis to be more accurate, well-rounded, reliable, and focused on a specific business capability. Big Data Imperatives describes the complementary nature of traditional data warehouses and big data analytics platforms and how they feed each other. This book aims to bring the big data and analytics realms together with a greater focus on architectures that leverage the scale and power of big data and the ability to integrate and apply analytics principles to data that was not previously accessible. This book can also be used as a handbook for practitioners, helping them with methodology, technical architecture, analytics techniques, and best practices. At the same time, it aims to hold the interest of those new to big data and analytics by giving them a deep insight into the realm of big data.
What you'll learn:
- Understanding the technology and implementation of big data platforms and their usage for analytics
- Big data architectures
- Big data design patterns
- Implementation best practices
Who this book is for: This book is designed for IT professionals, data warehousing and business intelligence professionals, data analysis professionals, architects, developers, and business users.

Data Warehousing in the Age of Big Data

Data Warehousing in the Age of Big Data will help you and your organization make the most of unstructured data with your existing data warehouse. As Big Data continues to revolutionize how we use data, it doesn't have to create more confusion. Expert author Krish Krishnan helps you make sense of how Big Data fits into the world of data warehousing in clear and concise detail. The book is presented in three distinct parts. Part 1 discusses Big Data, its technologies, and use cases from early adopters. Part 2 addresses data warehousing, its shortcomings, and new architecture options, workloads, and integration techniques for Big Data and the data warehouse. Part 3 deals with data governance, data visualization, information life-cycle management, data scientists, and implementing a Big Data–ready data warehouse. Extensive appendixes include case studies from vendor implementations and a special segment on how we can build a healthcare information factory. Ultimately, this book will help you navigate through the complex layers of Big Data and data warehousing while providing you information on how to effectively think about using all these technologies and the architectures to design the next-generation data warehouse.
- Learn how to leverage Big Data by effectively integrating it into your data warehouse
- Includes real-world examples and use cases that clearly demonstrate Hadoop, NoSQL, HBase, Hive, and other Big Data technologies
- Understand how to optimize and tune your current data warehouse infrastructure and integrate newer infrastructure matching data processing workloads and requirements

Developing Cloud Applications with Windows Azure™ Storage

Get the focused, pragmatic guidance you need to build professional cloud applications using Windows Azure Storage. This is one of the few books centered on Windows Azure Storage capabilities, and the author provides essential, expert coverage of the four key services: BLOBs, tables, queues, and drives. Developers will gain hands-on insights, including detailed sections on business use cases and guidance for choosing the right storage option for the job.
- Provides architectural and programming guidance to professional developers and architects proficient with Microsoft Visual Studio, C#, and LINQ
- Illuminates when and how to use BLOB storage, table storage, queues, and Windows Azure Drive to build, host, and scale applications in Microsoft-managed datacenters
- Presents business-case context for choosing the right service for your scenario, e.g. readers will compare relational tables to Windows Azure tables to understand benefits and tradeoffs
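
The book itself targets the .NET libraries of that era; purely to illustrate the BLOB service concept, here is a short sketch using the current azure-storage-blob Python package, assuming a placeholder storage account connection string and a pre-existing container name.

```python
# Illustrative sketch of blob storage usage; the connection string and
# container name are placeholders, and the book's own examples use .NET.
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=..."  # placeholder

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client("reports")  # assumes the container exists

# Upload a small blob, then list what the container holds under a prefix.
container.upload_blob("daily/2024-01-05.txt", b"sales,11.97\n", overwrite=True)
for blob in container.list_blobs(name_starts_with="daily/"):
    print(blob.name, blob.size)
```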

IBM SmartCloud Storage Access V1.1 Configuration Cookbook

This IBM® Redbooks® publication will help you learn how to build a storage cloud with the IBM SmartCloud™ Storage Access solution, which consists of multiple hardware and software products including IBM Scale Out Network Attached Storage (SONAS), IBM Storwize® V7000 Unified, IBM Tivoli® Storage Productivity Center, and more. To get you started, we cover the planning, installation, and configuration of each component. SmartCloud Storage Access (SCSA) is an IBM software product for storage cloud management that enables the cloud service with virtualization and automation. SCSA is the focal point for administering the storage cloud environment, providing a self-service provisioning approach for storage cloud users. New storage capacity can be easily deployed and accessed over the Internet or an intranet. SmartCloud Storage Access also supports simple and fast resource elasticity as user demand changes. With the SmartCloud Storage Access solution, storage resources are displayed as unified resource pools with different service levels. Users no longer need to know the exact location of their files, and there is no need to configure the underlying storage subsystems manually. All storage resources are still well monitored, and cloud administrators can easily track historical storage resource utilization. This publication is intended for anyone who wants to understand more about IBM SmartCloud Storage Access planning, implementation, configuration, and usage. This book is suitable for IBM clients, IBM Business Partners, IBM specialist sales representatives, and technical specialists.

Computation and Storage in the Cloud

Computation and Storage in the Cloud is the first comprehensive and systematic work investigating the computation and storage trade-off in the cloud in order to reduce the overall application cost. Scientific applications are usually computation and data intensive, where complex computation tasks take a long time to execute and the generated datasets are often terabytes or petabytes in size. Storing valuable generated application datasets can save their regeneration cost when they are reused, not to mention the waiting time caused by regeneration. However, the large size of the scientific datasets is a big challenge for their storage. By proposing innovative concepts, theorems, and algorithms, this book will help bring the cost down dramatically for both cloud users and service providers to run computation and data intensive scientific applications in the cloud.
- Covers cost models and benchmarking that explain the necessary tradeoffs for both cloud providers and users
- Describes several novel strategies for storing application datasets in the cloud
- Includes real-world case studies of scientific research applications
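
The core trade-off the book studies can be stated in a few lines of arithmetic. The toy model below is not the book's algorithm, just an illustration under simple assumptions: store a generated dataset only if the expected storage cost over its lifetime is less than the expected cost of regenerating it each time it is needed.

```python
# Toy cost model for the store-vs-regenerate decision. All figures are
# illustrative assumptions, not values from the book.
def should_store(storage_cost_per_gb_month: float,
                 size_gb: float,
                 months_kept: int,
                 regeneration_cost: float,
                 expected_reuses: int) -> bool:
    """Store the dataset if keeping it is cheaper than regenerating it."""
    cost_to_store = storage_cost_per_gb_month * size_gb * months_kept
    cost_to_regenerate = regeneration_cost * expected_reuses
    return cost_to_store < cost_to_regenerate

# A 500 GB intermediate dataset, kept for a year, that costs $40 in compute
# to regenerate and is expected to be reused 20 times.
print(should_store(0.02, 500, 12, regeneration_cost=40, expected_reuses=20))
```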

Information Storage and Management: Storing, Managing, and Protecting Digital Information in Classic, Virtualized, and Cloud Environments, Second Edition

The new edition of a bestseller, now revised and updated throughout! This new edition of the unparalleled bestseller serves as a full training course all in one, and as the world's largest data storage company, EMC is the ideal author for such a critical resource. It covers the components of a storage system and the different storage system models while also offering essential new material that explores the advances in existing technologies and the emergence of the "Cloud," as well as updates and vital information on new technologies.
- Features a separate section on the emerging area of cloud computing
- Covers new technologies such as data de-duplication, unified storage, continuous data protection technology, virtual provisioning, FCoE, flash drives, storage tiering, big data, and more
- Details storage models such as Network Attached Storage (NAS), Storage Area Network (SAN), and Object Based Storage, along with virtualization at various infrastructure components
- Explores business continuity and security in physical and virtualized environments
- Includes an enhanced appendix for additional information
This authoritative guide is essential for getting up to speed on the newest advances in information storage and management.
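
Among the technologies listed, data de-duplication is simple enough to show in miniature. The sketch below is an illustration rather than anything from the book: it splits a file into fixed-size chunks and keeps only one copy of each distinct chunk, indexed by its SHA-256 hash.

```python
# Minimal fixed-size-chunk de-duplication: store each distinct chunk once,
# keyed by its hash, and record the file as a sequence of chunk references.
import hashlib

CHUNK_SIZE = 4096
chunk_store: dict[str, bytes] = {}   # hash -> chunk payload

def dedup_file(path: str) -> list[str]:
    """Return the list of chunk hashes that reconstructs the file."""
    recipe = []
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(CHUNK_SIZE), b""):
            key = hashlib.sha256(chunk).hexdigest()
            chunk_store.setdefault(key, chunk)   # new chunks only
            recipe.append(key)
    return recipe

# Usage: de-duplicate two backups; chunks they share are stored a single time.
# recipe_a = dedup_file("backup_monday.img")
# recipe_b = dedup_file("backup_tuesday.img")
# print(len(chunk_store), "unique chunks held for both backups")
```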

Data Warehouse Designs

This book presents two data warehouse solutions that deliver significant ROI: market basket analysis, approached as a database design issue rather than a data model issue; and time variance, or temporal data, which presents past events in their historical context. The former is available to any enterprise with a data warehouse, while the latter provides a simple design that accommodates large data volumes. The text combines these two database designs into one design, which performs market basket analysis of transactions in their historical context.
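
Market basket analysis, whatever the underlying design, boils down to counting how often items appear together in the same transaction. The short sketch below, with invented transactions, shows that basic pair-counting step; it illustrates the idea rather than the book's specific database designs.

```python
# Count item pairs that co-occur in the same transaction (basket).
# Transactions are invented sample data.
from collections import Counter
from itertools import combinations

transactions = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"beer", "chips", "bread"},
    {"butter", "milk"},
]

pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequently co-purchased pairs.
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```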

Cloud and Virtual Data Storage Networking

Written by noted author, blogger, industry analyst, and IT veteran, Greg Schulz, this book covers data storage networks for cloud and virtual environments, from a hardware, software, services, and best practices perspective. Filled with real-world insights, blueprints, and best practices, this vendor- and technology-neutral text provides the tools to achieve efficient, optimized, flexible, scalable, and resilient data storage networking infrastructures. Coverage includes public and private cloud, virtualization, and traditional IT environments.

Data Architecture

Data Architecture: From Zen to Reality explains the principles underlying data architecture, how data evolves with organizations, and the challenges organizations face in structuring and managing their data. Using a holistic approach to the field of data architecture, the book describes proven methods and technologies to solve the complex issues dealing with data. It covers the various applied areas of data, including data modelling and data model management, data quality, data governance, enterprise information management, database design, data warehousing, and warehouse design. This text is a core resource for anyone customizing or aligning data management systems, taking the Zen-like idea of data architecture to an attainable reality. The book presents fundamental concepts of enterprise architecture with definitions and real-world applications and scenarios. It teaches data managers and planners about the challenges of building a data architecture roadmap, structuring the right team, and building a long-term set of solutions. It includes the detail needed to illustrate how the fundamental principles are used in current business practice. The book is divided into five sections, one of which addresses the software-application development process, defining tools, techniques, and methods that ensure repeatable results. Data Architecture is intended for people in business management involved with corporate data issues and information technology decisions, ranging from data architects to IT consultants, IT auditors, and data administrators. It is also an ideal reference tool for those in a higher-level education process involved in data or information technology management.

Data Integration Blueprint and Modeling: Techniques for a Scalable and Sustainable Architecture

Making Data Integration Work: How to Systematically Reduce Cost, Improve Quality, and Enhance Effectiveness. Today’s enterprises are investing massive resources in data integration. Many possess thousands of point-to-point data integration applications that are costly, undocumented, and difficult to maintain. Data integration now accounts for a major part of the expense and risk of typical data warehousing and business intelligence projects, and as businesses increasingly rely on analytics, the need for a blueprint for data integration is greater than ever. This book presents the solution: a clear, consistent approach to defining, designing, and building data integration components to reduce cost, simplify management, enhance quality, and improve effectiveness. Leading IBM data management expert Tony Giordano brings together best practices for architecture, design, and methodology, and shows how to do the disciplined work of getting data integration right. Mr. Giordano begins with an overview of the “patterns” of data integration, showing how to build blueprints that smoothly handle both operational and analytic data integration. Next, he walks through the entire project lifecycle, explaining each phase, activity, task, and deliverable through a complete case study. Finally, he shows how to integrate data integration with other information management disciplines, from data governance to metadata. The book’s appendices bring together key principles, detailed models, and a complete data integration glossary. Coverage includes:
- Implementing repeatable, efficient, and well-documented processes for integrating data
- Lowering costs and improving quality by eliminating unnecessary or duplicative data integrations
- Managing the high levels of complexity associated with integrating business and technical data
- Using intuitive graphical design techniques for more effective process and data integration modeling
- Building end-to-end data integration applications that bring together many complex data sources
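
To make the "repeatable, well-documented process" idea concrete, the sketch below reduces one point-to-point integration to explicit extract, transform, and load steps using the standard csv and sqlite3 modules. The file name, column names, and target table are invented; this is not one of the book's blueprints.

```python
# A single data integration flow broken into named extract / transform / load
# steps. The source file, column names, and target table are invented.
import csv
import sqlite3

def extract(path):
    with open(path, newline="") as fh:
        yield from csv.DictReader(fh)

def transform(rows):
    for row in rows:
        yield {
            "customer_id": int(row["cust_id"]),
            "email": row["email"].strip().lower(),   # simple data quality rule
            "country": row["country"].upper(),
        }

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customer (customer_id INTEGER, email TEXT, country TEXT)"
    )
    conn.executemany(
        "INSERT INTO customer VALUES (:customer_id, :email, :country)", rows
    )
    conn.commit()

conn = sqlite3.connect("warehouse.db")
load(transform(extract("crm_export.csv")), conn)
conn.close()
```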

DW 2.0: The Architecture for the Next Generation of Data Warehousing

DW 2.0: The Architecture for the Next Generation of Data Warehousing is the first book on the new generation of data warehouse architecture, DW 2.0, by the father of the data warehouse. The book describes the future of data warehousing that is technologically possible today, at both an architectural level and technology level. The perspective of the book is from the top down: looking at the overall architecture and then delving into the issues underlying the components. This allows people who are building or using a data warehouse to see what lies ahead and determine what new technology to buy, how to plan extensions to the data warehouse, what can be salvaged from the current system, and how to justify the expense at the most practical level. This book gives experienced data warehouse professionals everything they need in order to implement the new generation DW 2.0. It is designed for professionals in the IT organization, including data architects, DBAs, systems design and development professionals, as well as data warehouse and knowledge management professionals.
- First book on the new generation of data warehouse architecture, DW 2.0
- Written by the "father of the data warehouse", Bill Inmon, a columnist and newsletter editor of The Bill Inmon Channel on the Business Intelligence Network
- Long overdue comprehensive coverage of the implementation of technology and tools that enable the new generation of the DW: metadata, temporal data, ETL, unstructured data, and data quality control

Data Warehousing Fundamentals for IT Professionals

Cutting-edge content and guidance from a data warehousing expert—now expanded to reflect field trends Data warehousing has revolutionized the way businesses in a wide variety of industries perform analysis and make strategic decisions. Since the first edition of Data Warehousing Fundamentals, numerous enterprises have implemented data warehouse systems and reaped enormous benefits. Many more are in the process of doing so. Now, this new, revised edition covers the essential fundamentals of data warehousing and business intelligence as well as significant recent trends in the field. The author provides an enhanced, comprehensive overview of data warehousing together with in-depth explanations of critical issues in planning, design, deployment, and ongoing maintenance. IT professionals eager to get into the field will gain a clear understanding of techniques for data extraction from source systems, data cleansing, data transformations, data warehouse architecture and infrastructure, and the various methods for information delivery. This practical Second Edition highlights the areas of data warehousing and business intelligence where high-impact technological progress has been made. Discussions on developments include data marts, real-time information delivery, data visualization, requirements gathering methods, multi-tier architecture, OLAP applications, Web clickstream analysis, data warehouse appliances, and data mining techniques. The book also contains review questions and exercises for each chapter, appropriate for self-study or classroom work, industry examples of real-world situations, and several appendices with valuable information. Specifically written for professionals responsible for designing, implementing, or maintaining data warehousing systems, Data Warehousing Fundamentals presents agile, thorough, and systematic development principles for the IT professional and anyone working or researching in information management.

The Kimball Group Reader: Relentlessly Practical Tools for Data Warehousing and Business Intelligence

An unparalleled collection of recommended guidelines for data warehousing and business intelligence pioneered by Ralph Kimball and his team of colleagues from the Kimball Group. Recognized and respected throughout the world as the most influential leaders in the data warehousing industry, Ralph Kimball and the Kimball Group have written articles covering more than 250 topics that define the field of data warehousing. For the first time, the Kimball Group's incomparable advice, design tips, and best practices have been gathered in this remarkable collection of articles, which spans a decade of data warehousing innovation. Each group of articles is introduced with original commentaries that explain their role in the overall lifecycle methodology developed by the Kimball Group. These practical, hands-on articles are fully updated to reflect current practices and terminology and cover the complete lifecycle—including project planning, requirements gathering, dimensional modeling, ETL, and business intelligence and analytics. This easily referenced collection is nothing less than vital if you are involved with data warehousing or business intelligence in any capacity.

Strategic Data Warehousing

The organization of data warehouses is a vital but often ignored aspect of growing enterprises. This work merges technological know-how with managerial practices to show both the business manager and the IT professional how better alignment between data warehouse plans and business strategies can lead to a successful data warehouse adoption that will support the entire infrastructure. More complete than any other text in the field, this resource also addresses the managerial and strategic aspects of data warehouses, offering workable solutions that allow for the strategic alignment of these warehouses while building them and ensuring that this alignment is sustained.