talk-data.com

Topic: data-engineering (3395 tagged)

[Activity trend chart: 1 peak/qtr, 2020-Q1 to 2026-Q1]

Activities
3395 activities · Newest first

Microsoft BizTalk Server 2010 Patterns

Microsoft BizTalk Server 2010 Patterns introduces readers to BizTalk Server 2010, providing a comprehensive overview of its core functionalities and advanced development practices. Through real-world examples and detailed step-by-step instructions, this book equips developers with the necessary tools to craft scalable, robust integration solutions that solve modern middleware challenges effectively.

What this Book will help me do:
* Understand the architecture and internals of BizTalk Server 2010 and design optimal topologies for various scenarios.
* Develop and structure BizTalk Server 2010 projects efficiently, including robust unit testing practices.
* Implement integration solutions using WCF services, Business Activity Monitoring, and the Business Rules Engine effectively.
* Apply best practices like convoy patterns and dynamic message routing to build scalable and adaptable solutions (a generic convoy sketch follows this entry).
* Deploy, manage, and troubleshoot BizTalk Server 2010 applications while ensuring ease of future modifications and maintenance.

Author(s): Dan Rosanova, a recognized Microsoft BizTalk Server Architecture MVP, shares his expertise honed through years of building and deploying enterprise solutions. With his deep understanding of middleware integration and his clear, no-nonsense writing style, Dan aims to guide developers through mastering BizTalk effectively while leveraging its full potential.

Who is it for? This book is ideal for developers and architects tasked with building solutions utilizing BizTalk Server 2010. If you have prior experience with Visual Studio and basic familiarity with BizTalk, you'll find advanced concepts and practical guidance to enhance your skills. Technical managers overseeing BizTalk projects will also benefit from the structured methodologies and foundational patterns presented.
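The convoy pattern called out above is easier to grasp with a toy model. Below is a minimal, generic Python sketch of a parallel convoy, not BizTalk's actual orchestration machinery: messages sharing a correlation ID are gathered, and processing fires only once the whole set has arrived. The message types and payloads are hypothetical.

```python
# A generic parallel-convoy sketch (not the BizTalk API): correlate related
# messages by a shared key and act only when the complete set has arrived.
from collections import defaultdict

REQUIRED_PARTS = {"purchase_order", "credit_check", "inventory_check"}
pending = defaultdict(dict)  # correlation id -> {message type: payload}

def receive(correlation_id, msg_type, payload):
    """Correlate one incoming message; fire once the convoy is complete."""
    pending[correlation_id][msg_type] = payload
    if REQUIRED_PARTS <= pending[correlation_id].keys():
        process_order(correlation_id, pending.pop(correlation_id))

def process_order(correlation_id, parts):
    print(f"order {correlation_id}: all {len(parts)} convoy messages arrived")

# Hypothetical usage: the three related messages may arrive in any order.
receive("PO-17", "credit_check", {"approved": True})
receive("PO-17", "inventory_check", {"in_stock": True})
receive("PO-17", "purchase_order", {"total": 99.0})  # completes the convoy
```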

IT Security Policy Management Usage Patterns Using IBM Tivoli Security Policy Manager

In a growing number of organizations, policies are the key mechanism by which the capabilities and requirements of services are expressed and made available to other entities. The goals established and driven by the business need to be consistently implemented, managed, and enforced by the service-oriented infrastructure; expressing these goals as policy and effectively managing this policy are fundamental to the success of any IT and application transformation. First, a flexible policy management framework must be in place to achieve alignment with business goals and consistent security implementation. Second, common re-usable security services are foundational building blocks for SOA environments, providing the ability to secure data and applications. Consistent IT security services that can be used by different components of an SOA runtime are required. Point solutions are not scalable, and cannot capture and express enterprise-wide policy to ensure consistency and compliance.

In this IBM® Redbooks® publication, we discuss an IBM Security policy management solution, which is composed of both policy management and enforcement using IT security services. We discuss how this standards-based unified policy management and enforcement solution can address authentication, identity propagation, and authorization requirements, and thereby help organizations demonstrate compliance, secure their services, and minimize the risk of data loss. This book is a valuable resource for security officers, consultants, and architects who want to understand and implement a centralized security policy management and entitlement solution.

IBM IMS Version 12 Technical Overview

IBM® Information Management System (IMS™) provides leadership in performance, reliability, and security to help you implement the most strategic and critical enterprise applications. IMS also keeps pace with the IT industry. IMS, Enterprise Suite 2.1, and IMS Tools continue to evolve to provide value and meet the needs of enterprise customers. With IMS 12, integration and open access improvements provide flexibility and support business growth requirements. Manageability enhancements help optimize system staff productivity by improving ease of use and autonomic computing facilities and by providing increased availability. Scalability improvements have been made to the well-known performance, efficiency, availability, and resilience of IMS by using 64-bit storage.

IBM IMS Enterprise Suite for z/OS® V2.1 components enhance the use of IMS applications and data. In this release, components (either orderable or downloaded from the web) deliver innovative new capabilities for your IMS environment. They enhance connectivity, expand application development, extend standards and tools for a service-oriented architecture (SOA), ease installation, and provide simplified interfaces.

This IBM Redbooks® publication explores the new features of IMS 12 and Enterprise Suite 2.1 and provides an overview of the IMS tools. In addition, this book highlights the major new functions and helps database administrators plan for installation and migration.

Effective MySQL Optimizing SQL Statements

The Essential Guide to SQL Statement Optimization

Written by Oracle ACE Director and MySQL expert Ronald Bradford, Effective MySQL: Optimizing SQL Statements is filled with detailed explanations and practical examples that can be applied immediately to improve database and application performance. Featuring a step-by-step approach to SQL optimization, this Oracle Press book helps you analyze and tune problematic SQL statements.

* Identify the essential analysis commands for gathering and diagnosing issues
* Learn how different index theories are applied and represented in MySQL
* Plan and execute informed SQL optimizations
* Create MySQL indexes to improve query performance
* Master the MySQL query execution plan
* Identify key configuration variables that impact SQL execution and performance
* Apply the SQL optimization lifecycle to capture, identify, confirm, analyze, and optimize SQL statements and verify the results (sketched after this list)
* Improve index utilization with covering indexes and partial indexes
* Learn hidden performance tips for improving index efficiency and simplifying SQL statements
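To make the optimization lifecycle above concrete, here is a minimal Python sketch using the mysql-connector-python driver; the orders table, its columns, the index name, and the connection details are hypothetical placeholders.

```python
# A minimal sketch of the capture -> analyze -> optimize -> confirm loop,
# assuming a hypothetical `orders` table in a local MySQL instance.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="app",
                               password="secret", database="shop")
cur = conn.cursor()

# Capture: the statement under investigation.
query = "SELECT order_id, total FROM orders WHERE customer_id = %s"

# Analyze: EXPLAIN reveals whether an index serves the predicate; watch
# the `key` and `rows` columns of the output.
cur.execute("EXPLAIN " + query, (42,))
print(cur.fetchall())

# Optimize: a covering index lets MySQL answer from the index alone.
cur.execute("CREATE INDEX idx_orders_cust_cover "
            "ON orders (customer_id, order_id, total)")

# Confirm: re-run EXPLAIN; `Using index` in the Extra column indicates
# the new index covers the query.
cur.execute("EXPLAIN " + query, (42,))
print(cur.fetchall())
conn.close()
```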

Microsoft SQL Server 2012 Reporting Services, 4th Edition

The Definitive Guide to Microsoft SQL Server 2012 Reporting Services

Create, deploy, and manage business intelligence reports using the expert tips and best practices in this hands-on resource. Written by a member of the original Reporting Services development team, Microsoft SQL Server 2012 Reporting Services, Fourth Edition covers the complete process of building and distributing reports and explains how to maximize all of the powerful, integrated SSRS capabilities, including the new and enhanced features. A detailed case study and sample reports are included in this practical reference.

* Plan for, install, configure, and customize SQL Server 2012 Reporting Services
* Retrieve data with SELECT queries
* Generate reports from the Report Wizard and from scratch
* Enhance your reports with charts, images, gauges, and maps
* Add value to reports through summarizing, totaling, and interactivity
* Build reusable report templates
* Embed Visual Basic .NET functions and subreports into your reports
* Enable end-user access to reports via the Report Server and its Report Manager web interface
* Integrate SSRS reports with your own websites and custom applications
* Follow along with sample reports from the book's case study

Oracle Database 11g & MySQL 5.6 Developer Handbook

Master Application Development in a Mixed-Platform Environment

Build powerful database applications in a mixed environment using the detailed information in this Oracle Press guide. Oracle Database 11g & MySQL 5.6 Developer Handbook lays out programming strategies and best practices for seamlessly operating between the two platforms. Find out how to migrate databases, port SQL dialects, work with Oracle and MySQL databases, and configure effective queries. Security, monitoring, and tuning techniques are also covered in this comprehensive volume.

* Understand Oracle Database 11g and MySQL 5.6 architecture
* Convert databases between platforms and ensure transactional integrity
* Create tables, sequences, indexes, views, and user accounts
* Build and debug PL/SQL, SQL*Plus, SQL/PSM, and MySQL Monitor scripts
* Execute complex queries and handle numeric and date mathematics
* Merge data from source tables and set up virtual directories
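As a flavor of the dialect-porting work the handbook describes, here is a small, hypothetical Python sketch: the same logical intent is spelled differently on each platform (Oracle sequences versus MySQL AUTO_INCREMENT, SYSDATE versus NOW()). The intent names and the order_seq object are invented for illustration.

```python
# Hypothetical dialect map: one logical intent, two SQL spellings.
DIALECT = {
    "current timestamp":  {"oracle": "SYSDATE",
                           "mysql":  "NOW()"},
    "next surrogate key": {"oracle": "order_seq.NEXTVAL",  # sequence object
                           "mysql":  "NULL  -- filled by AUTO_INCREMENT"},
    "30 days from now":   {"oracle": "SYSDATE + 30",
                           "mysql":  "NOW() + INTERVAL 30 DAY"},
}

def render(intent, platform):
    """Return the platform-specific SQL fragment for a logical intent."""
    return DIALECT[intent][platform]

print(render("30 days from now", "oracle"))  # SYSDATE + 30
print(render("30 days from now", "mysql"))   # NOW() + INTERVAL 30 DAY
```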

Oracle Hyperion Financial Management Tips And Techniques

Master Oracle Hyperion Financial Management

Consolidate financial data and maintain a scalable compliance framework with expert instruction from an Oracle ACE. Oracle Hyperion Financial Management Tips & Techniques provides advanced, time-saving procedures not documented in user manuals or help files. Find out how to configure Oracle Hyperion Financial Management, import and reconcile data, deliver dynamic business reports, and automate administrative tasks. Strategies for supporting, testing, and tuning your application are also covered in this comprehensive Oracle Press guide.

* Establish objectives and develop an effective rollout plan
* Set up and customize Oracle Hyperion Financial Management
* Create rules using VBScript and the Calculation Manager feature of Oracle Hyperion Foundation Services
* Load, test, and reconcile your data with Oracle Data Integrator and Oracle Hyperion Financial Data Quality Management
* Design, update, and distribute Web-based business reports
* Integrate content from Microsoft Excel, Word, and PowerPoint using SmartView
* Work with the Lifecycle Management feature of Oracle Hyperion Foundation Services
* Identify and resolve performance, design, and capacity problems

IBM zEnterprise 196 Technical Guide

The popularity of the Internet and the affordability of IT hardware and software have resulted in an explosion of applications, architectures, and platforms. Workloads have changed. Many applications, including mission-critical ones, are deployed on a variety of platforms, and the System z® design has adapted to this change. It takes into account a wide range of factors, including compatibility and investment protection, to match the IT requirements of an enterprise.

The zEnterprise System consists of the IBM zEnterprise 196 central processor complex, the IBM zEnterprise Unified Resource Manager, and the IBM zEnterprise BladeCenter® Extension. The z196 is designed with improved scalability, performance, security, resiliency, availability, and virtualization. The z196 Model M80 provides up to 1.6 times the total system capacity of the z10™ EC Model E64, and all z196 models provide up to twice the available memory of the z10 EC.

The zBX infrastructure works with the z196 to enhance System z virtualization and management through an integrated hardware platform that spans mainframe, POWER7™, and System x® technologies. Through the Unified Resource Manager, the zEnterprise System is managed as a single pool of resources, integrating system and workload management across the environment.

This IBM® Redbooks® publication provides an overview of the zEnterprise System and its functions, features, and associated software support. Greater detail is offered in areas relevant to technical planning. This book is intended for systems engineers, consultants, planners, and anyone wanting to understand the zEnterprise System functions and plan for their usage. It is not intended as an introduction to mainframes. Readers are expected to be generally familiar with existing IBM System z technology and terminology. The changes to this edition are based on the System z hardware announcement dated July 12, 2011.

Value Realization from Efficient Software Deployment

Unfortunately, purchasing software products does not automatically mean that these products are exploited throughout the organization, providing the maximum possible value to the business units. Several issues call for a structured approach that gets the most business value out of software already purchased. The objectives of this approach include:
* Create maximum awareness throughout the organization of the software purchased.

We can summarize the overall objective of this approach as ensuring that the business units in an organization obtain the maximum possible value from software products purchased, which is also the scope of this IBM Redbooks publication.

Metadata Management with IBM InfoSphere Information Server

What do you know about your data? And how do you know what you know about your data? Information governance initiatives address corporate concerns about the quality and reliability of information in planning and decision-making processes. Metadata management refers to the tools, processes, and environment that are provided so that organizations can reliably and easily share, locate, and retrieve information from their systems. Enterprise-wide information integration projects integrate data from these systems into one location to generate required reports and analysis. During this type of implementation process, metadata management must be provided along each step to ensure that the final reports and analysis come from the right data sources, are complete, and have quality.

This IBM® Redbooks® publication introduces the information governance initiative and highlights the immediate needs for metadata management. It explains how IBM InfoSphere™ Information Server provides a single unified platform and a collection of product modules and components so that organizations can understand, cleanse, transform, and deliver trustworthy and context-rich information. It describes a typical implementation process and explains how InfoSphere Information Server provides the functions that are required to implement such a solution and, more importantly, to achieve metadata management.

This book provides business leaders and IT architects with an overview of metadata management in the information integration solution space. It also provides key technical details that IT professionals can use in solution planning, design, and implementation.

The IBM Style Guide: Conventions for Writers and Editors

The IBM Style Guide distills IBM wisdom for developing superior content: information that is consistent, clear, concise, and easy to translate. The IBM Style Guide can help any organization improve and standardize content across authors, delivery mechanisms, and geographic locations. This expert guide contains practical guidance on topic-based writing, writing content for different media types, and writing for global audiences. Throughout, the authors illustrate the guidance with many examples of correct and incorrect usage. Writers and editors will find authoritative guidance on issues ranging from structuring information to writing usable procedures to presenting web addresses to handling cultural sensitivities.

The guidelines cover these topics:
* Using language and grammar to write clearly and consistently
* Applying punctuation marks and special characters correctly
* Formatting, organizing, and structuring information so that it is easy to find and use
* Using footnotes, cross-references, and links to point readers to valuable, related information
* Presenting numerical information clearly
* Documenting computer interfaces to make it easy for users to achieve their goals
* Writing for diverse audiences, including guidelines for improving accessibility
* Preparing clear and effective glossaries and indexes

The IBM Style Guide can help any organization or individual create and manage content more effectively. The guidelines are especially valuable for businesses that have not previously adopted a corporate style guide, for anyone who writes or edits for IBM as an employee or outside contractor, and for anyone who uses modern approaches to information architecture.

Programming Pig

This guide is an ideal learning tool and reference for Apache Pig, the open source engine for executing parallel data flows on Hadoop. With Pig, you can batch-process data without having to create a full-fledged application, making it easy for you to experiment with new datasets. Programming Pig introduces new users to Pig, and provides experienced users with comprehensive coverage on key features such as the Pig Latin scripting language, the Grunt shell, and User Defined Functions (UDFs) for extending Pig. If you need to analyze terabytes of data, this book shows you how to do it efficiently with Pig.

* Delve into Pig's data model, including scalar and complex data types
* Write Pig Latin scripts to sort, group, join, project, and filter your data
* Use Grunt to work with the Hadoop Distributed File System (HDFS)
* Build complex data processing pipelines with Pig's macros and modularity features
* Embed Pig Latin in Python for iterative processing and other advanced tasks (sketched after this list)
* Create your own load and store functions to handle data formats and storage mechanisms
* Get performance tips for running scripts on Hadoop clusters in less time
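The embedding bullet above is worth a concrete taste. The sketch below follows Pig's documented org.apache.pig.scripting interface and runs under Pig's Jython support (for example, pig -x local script.py); the input file, field names, and output path are hypothetical.

```python
# Embedding Pig Latin in Python via Pig's Jython scripting support.
from org.apache.pig.scripting import Pig

# Compile a parameterized pipeline: load, filter, group, count, store.
pipeline = Pig.compile("""
    raw    = LOAD '$input' AS (user:chararray, action:chararray);
    clicks = FILTER raw BY action == 'click';
    byuser = GROUP clicks BY user;
    counts = FOREACH byuser GENERATE group AS user, COUNT(clicks) AS n;
    STORE counts INTO '$output';
""")

# Bind the parameters and run a single job; stats reports the outcome.
stats = pipeline.bind({"input": "events.tsv",
                       "output": "click_counts"}).runSingle()
if not stats.isSuccessful():
    raise RuntimeError("Pig job failed")
```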

SQL Server MVP Deep Dives, Volume 2

SQL Server MVP Deep Dives, Volume 2 lets you learn from the best in the business: 64 SQL Server MVPs offer completely new content in this second volume, on topics ranging from testing and policy management to integration services, reporting, performance optimization techniques, and more.

About the Book
To become an MVP requires deep knowledge and impressive skill. Together, the 64 MVPs who wrote this book bring about 1,000 years of experience in SQL Server administration, development, training, and design. This incredible book captures their expertise and passion in 60 concise, hand-picked chapters. SQL Server MVP Deep Dives, Volume 2 picks up where the first volume leaves off, with completely new content on topics ranging from testing and policy management to integration services, reporting, and performance optimization. The chapters fall into five parts: Architecture and Design, Database Administration, Database Development, Performance Tuning and Optimization, and Business Intelligence.

What's Inside
* Discovering servers with PowerShell
* Using regular expressions in SSMS
* Tuning the Transaction Log for OLTP
* Optimizing SSIS for dimensional data
* Real-time BI
* Much more

About the Reader
This unique book is your chance to learn from the best in the business. It offers valuable insights for readers of all levels.

About the Authors
Written by 64 SQL Server MVPs, the chapters were selected and edited by Kalen Delaney and section editors Louis Davidson (Architecture and Design), Paul Randal and Kimberly Tripp (Database Administration), Paul Nielsen (Database Development), Brad McGehee (Performance Tuning and Optimization), and Greg Low (Business Intelligence).

SAP Applications on IBM PowerVM

IBM® pioneered virtualization technology on the mainframe in the 1960s; the functionality has since evolved, been ported to other platforms, and gained improved reliability, availability, and serviceability (RAS) features. With virtualization, you achieve better asset utilization, reduced operating costs, and faster responsiveness to changing business demands. Every technology vendor in the SAP ecosystem understands virtualization slightly differently, as capabilities at different levels (storage and server hardware, processor, memory, I/O resources, or the application, and so on). It is important to understand exactly what functionality is offered and how it supports the client's business requirements.

In this IBM Redbooks® publication, we focus on server virtualization technologies in the IBM Power Systems™ hardware, AIX®, IBM i, and Linux space and what they mean specifically for SAP applications running on this platform. SAP clients can leverage the technology that the IBM Power Systems platform offers. In this book, we describe the technologies and functions, what they mean, and how they apply to the SAP system landscape.

IBM InfoSphere Streams: Assembling Continuous Insight in the Information Revolution

In this IBM® Redbooks® publication, we discuss and describe the positioning, functions, capabilities, and advanced programming techniques for IBM InfoSphere™ Streams (V2), a new paradigm and key component of the IBM Big Data platform.

Data has traditionally been stored in files or databases and then analyzed by queries and applications. With stream computing, analysis is performed moment by moment as the data is in motion; in fact, the data might never be stored (perhaps only the analytic results). The ability to analyze data in motion is called real-time analytic processing (RTAP). IBM InfoSphere Streams takes a fundamentally different approach to Big Data analytics and differentiates itself with its distributed runtime platform, programming model, and tools for developing and debugging analytic applications that have a high volume and variety of data types. Using in-memory techniques and analyzing record by record enables high velocity. Volume, variety, and velocity are the key attributes of Big Data.

The data streams that are consumable by IBM InfoSphere Streams can originate from sensors, cameras, news feeds, stock tickers, and a variety of other sources, including traditional databases. The product provides an execution platform and services for applications that ingest, filter, analyze, and correlate potentially massive volumes of continuous data streams.

This book is intended for professionals who require an understanding of how to process high volumes of streaming data or need information about how to implement systems to satisfy those requirements. See http://www.redbooks.ibm.com/abstracts/sg247865.html for the IBM InfoSphere Streams (V1) release.
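InfoSphere Streams applications are written in its own Streams Processing Language, but the RTAP idea described above, analyzing each record as it arrives while keeping only bounded state, can be illustrated with a generic Python sketch; the sensor feed and threshold are hypothetical.

```python
# Generic stream-analysis sketch (not the InfoSphere Streams API): analyze
# records in motion, keep only a bounded window, never store the raw feed.
from collections import deque

def anomalies(stream, window_size=100, threshold=10.0):
    """Yield readings that deviate sharply from the recent rolling mean."""
    window = deque(maxlen=window_size)  # bounded in-memory state
    for reading in stream:
        if window and abs(reading - sum(window) / len(window)) > threshold:
            yield reading               # emit the analytic result only
        window.append(reading)

# Hypothetical usage with a small sensor feed:
for alert in anomalies([1.0, 1.2, 0.9, 1.1, 42.0]):
    print("anomaly:", alert)           # flags the 42.0 spike
```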

Implementing Imaging Solutions with IBM Production Imaging Edition and IBM Datacap Taskmaster Capture

Organizations face many challenges in managing documents that they need to conduct their business. IBM® Production Imaging Edition V5.0 is the comprehensive product that combines imaging, capture, and automation to provide the capabilities to process and manage high volumes of document imaging over their entire life cycle. This IBM Redbooks® publication introduces Production Imaging Edition, its components, the system architecture, its functions, and its capabilities. It primarily focuses on IBM Datacap Taskmaster Capture V8.0, including how it works, how to design a document image capture solution, and how to implement the solution using Datacap Studio. Datacap Studio is a development tool that designers use to create rules and rule sets, configure a document hierarchy and task profiles, and set up a verification panel for image verification. This book highlights the advanced technologies that are used to create dynamic applications, such as IBM Taskmaster Accounts Payable Capture. It includes an in-depth walkthrough of the dynamic application, Taskmaster Accounts Payable Capture, which provides invaluable insight to designers in developing and customizing their applications. In addition, this book includes information about high availability, scalability, performance, and backup and recovery options for the document imaging solution. It provides known best practices and recommendations for designing and implementing such a solution. This book is for IT architects and professionals who are responsible for creating, improving, designing, and implementing document imaging solutions for their organizations.

IBM zEnterprise 114 Technical Guide

The popularity of the Internet and the affordability of IT hardware and software have resulted in an explosion of applications, architectures, and platforms. Workloads have changed. Many applications, including mission-critical ones, are deployed on a variety of platforms, and the System z® design has adapted to this change. It takes into account a wide range of factors, including compatibility and investment protection, to match the IT requirements of an enterprise.

This IBM® Redbooks® publication discusses the IBM zEnterprise System, an IBM scalable mainframe server. IBM is taking a revolutionary approach by integrating separate platforms under the well-proven System z hardware management capabilities, while extending System z qualities of service to those platforms.

The zEnterprise System consists of the IBM zEnterprise 114 central processor complex, the IBM zEnterprise Unified Resource Manager, and the IBM zEnterprise BladeCenter® Extension. The z114 is designed with improved scalability, performance, security, resiliency, availability, and virtualization. The z114 provides up to 18% improvement in uniprocessor speed and up to a 12% increase in total system capacity for z/OS®, z/VM®, and Linux on System z over the z10™ Business Class (BC).

The zBX infrastructure works with the z114 to enhance System z virtualization and management through an integrated hardware platform that spans mainframe, POWER7™, and System x technologies. The federated capacity from multiple architectures of the zEnterprise System is managed as a single pool of resources, integrating system and workload management across the environment through the Unified Resource Manager.

This book provides an overview of the zEnterprise System and its functions, features, and associated software support. Greater detail is offered in areas relevant to technical planning. This book is intended for systems engineers, consultants, planners, and anyone wanting to understand the zEnterprise System functions and plan for their usage. It is not intended as an introduction to mainframes. Readers are expected to be generally familiar with existing IBM System z technology and terminology.

Better Business Decisions Using Cost Modeling

Information is power in supply chain operations, negotiations, continuous improvement programs, and process improvement, and indeed in all aspects of managing an operation. Accurate and timely information can result in better decisions that translate into improvement of bottom-line results. The development and effective use of cost modeling as a method to understand the cost of products, services, and processes can help drive improvements in the quality and timeliness of decision making.

In the supply chain community, an understanding of the actual cost structures of products and services, whether with new or non-partner suppliers, can facilitate fact-based discussions that are more likely to result in agreements that are competitively priced and carry fair margins. Further, accurate cost models that are cooperatively developed between supply chain partners can form the basis for joint efforts to reduce non-value-added costs and provide additional focus toward operational improvement. While many organizations feel confident they understand the cost structure for products and services produced internally, cost modeling often uncovers areas where significant cost improvement can be obtained. Cost of quality is a particular type of internal cost model that analyzes the true costs associated with the production of less-than-perfect products and services. The development of a cost of quality model can provide insight into how products or services of higher quality can be produced at lower cost.

This book provides the business student or professional a concise guide to the creation and effective use of both internal and external cost models. Development of internal cost models is discussed with illustrations showing how they can be deployed to assist in new product development, pricing decisions, make-or-buy decisions, and the identification of opportunities for internal process improvement projects. The creation and use of external cost models are discussed, providing insight into how their use can drive collaborative improvement efforts among supply chain partners, better prepare for price negotiations, and keep negotiations focused on facts rather than emotions, all while allowing future discussions with preferred suppliers to focus more on strategic and operational improvement initiatives and less on pricing. A number of detailed cost model examples are provided, both to show how cost models are constructed and to demonstrate how they have been effectively deployed.
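The cost-of-quality model mentioned above follows the classic prevention/appraisal/failure breakdown, which a short numeric sketch makes tangible. All figures below are invented purely for illustration.

```python
# Illustrative cost-of-quality model; every figure here is hypothetical.
costs = {
    "prevention":       25_000,   # training, process design
    "appraisal":        40_000,   # inspection, testing
    "internal_failure": 90_000,   # scrap and rework caught in-house
    "external_failure": 150_000,  # returns, warranty claims, lost goodwill
}

total = sum(costs.values())
failure_share = (costs["internal_failure"] + costs["external_failure"]) / total
print(f"Total cost of quality: ${total:,}")
print(f"Failure costs are {failure_share:.0%} of the total; shifting spend "
      "toward prevention may lower the overall cost of quality.")
```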

Privacy and Big Data

Much of what constitutes Big Data is information about us. Through our online activities, we leave an easy-to-follow trail of digital footprints that reveal who we are, what we buy, where we go, and much more. This eye-opening book explores the raging privacy debate over the use of personal data, with one undeniable conclusion: once data's been collected, we have absolutely no control over who uses it or how it is used. Personal data is the hottest commodity on the market today, truly more valuable than gold. We are the asset that every company, industry, non-profit, and government wants.

Privacy and Big Data introduces you to the players in the personal data game, and explains the stark differences in how the U.S., Europe, and the rest of the world approach the privacy issue. You'll learn about:
* Collectors: social networking titans that collect, share, and sell user data
* Users: marketing organizations, government agencies, and many others
* Data markets: companies that aggregate and sell datasets to anyone
* Regulators: governments with one policy for commercial data use, and another for providing security

SAP NetWeaver MDM 7.1 Administrator's Guide

SAP NetWeaver MDM 7.1 Administrator's Guide acts as a complete resource for mastering the administration and configuration of SAP's Master Data Management solution: NetWeaver MDM 7.1. With a hands-on and practical approach, this book connects theoretical understanding with real-world application, tailored specifically for MDM administrators.

What this Book will help me do:
* Understand the core concepts and business scenarios associated with SAP NetWeaver MDM.
* Master the configuration of MDM Console, Servers, repositories, and the underlying database.
* Learn to maintain repository integrity through backup, restore, and management techniques.
* Automate data operations like importing and syndicating through MDM tools.
* Grasp the integration aspects of MDM with other SAP NetWeaver components.

Author(s): Uday Rao is an experienced administrator and consultant in SAP systems, specializing in Master Data Management. With years of field experience, Uday brings deep technical insights combined with an approach that simplifies complex administration tasks. His guide emphasizes practical scenarios with step-by-step instructions that empower SAP professionals.

Who is it for? This book is ideal for SAP administrators aiming to specialize in Master Data Management with NetWeaver MDM. It targets professionals with foundational knowledge in SAP who are looking to gain expertise in configuring and managing MDM systems. Novices in SAP MDM can still benefit from the guide's structured approach. Whether you're managing corporate data systems or overseeing MDM projects, this guide aligns with your goals.