Whitepaper Library
 

data quality

Results 251 - 265 of 265
Published By: SAS     Published Date: Apr 16, 2015
Ecclesiastical uses SAS® to improve data quality so it can make better decisions that enhance the reputation of the business, affect millions of pounds of risk selection and underwriting, and help establish optimum reinsurance levels. Achieving the best possible outcome means taking data and turning it into ‘decision-making gold’, and the key is having good data going into the process.
Tags : 
    
SAS
Published By: Trillium Software     Published Date: Jun 06, 2011
The second Web seminar in the series, "Data Governance", shows how Digital River successfully implemented data governance across its organization.
Tags : 
trillium software, operational data quality, upstream operational systems, data governance, data monitoring, business processes, digital river, data management, business applications, governance, monitoring
    
Trillium Software
Published By: Trillium Software     Published Date: Jun 06, 2011
The fourth Web seminar in the series, "Tips, Tools & Best Practices", shows how to deliver high-performance data quality in real time.
Tags : 
trillium software, operational data quality, odq, performance management, real-time transactions, high performance data quality, business processes optimization, information integrity, reusable data quality, application optimization
    
Trillium Software
Published By: Oracle     Published Date: Nov 08, 2017
A webcast on why to choose Oracle Cloud solutions.
Tags : 
oracle, database, cloud, enterprise, network, quality assurance, storage management, servers
    
Oracle
Published By: IBM     Published Date: May 02, 2014
The end-to-end information integration capabilities of IBM® InfoSphere® Information Server are designed to help organizations understand, cleanse, monitor, transform and deliver data, as well as collaborate to bridge the gap between business and IT. A minimal illustration of the cleansing step follows this entry.
Tags : 
ibm, integrating big data, governing big data, integration, best practices, big data, ibm infosphere, it agility, performance requirements, hadoop, scalability, data integration, big data projects, high-quality data, leverage data replication, data persistence, virtualize data, data center
    
IBM
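For illustration, the "cleanse" step mentioned in the abstract above can be reduced to a few standardization rules. The following is a minimal, generic Python sketch; the field names and rules are hypothetical and this is not InfoSphere's API:

```python
# Minimal sketch of a cleanse-and-standardize step in a data
# integration pipeline. Field names and rules are hypothetical;
# real tools such as InfoSphere apply configurable rule sets.

import re

def cleanse_record(record: dict) -> dict:
    """Trim whitespace and standardize a few common fields."""
    cleaned = {k: v.strip() if isinstance(v, str) else v
               for k, v in record.items()}

    # Standardize phone numbers to digits only.
    if cleaned.get("phone"):
        cleaned["phone"] = re.sub(r"\D", "", cleaned["phone"])

    # Normalize e-mail addresses to lower case.
    if cleaned.get("email"):
        cleaned["email"] = cleaned["email"].lower()

    return cleaned

dirty = {"name": "  Ada Lovelace ", "phone": "(555) 010-2030",
         "email": "Ada@Example.COM"}
print(cleanse_record(dirty))
# {'name': 'Ada Lovelace', 'phone': '5550102030', 'email': 'ada@example.com'}
```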
Published By: IBM     Published Date: May 02, 2014
This eBook outlines best practices for data lifecycle management and how InfoSphere Optim solutions enable organizations to support and implement them.
Tags : 
ibm, integrating big data, governing big data, integration, best practices, big data, ibm infosphere, it agility, performance requirements, hadoop, scalability, data integration, big data projects, high-quality data, leverage data replication, data persistence, virtualize data, lifecycle management, big data strategy, data center
    
IBM
Published By: GE Healthcare     Published Date: Feb 23, 2015
Southwestern Ontario Diagnostic Imaging Network relies on Centricity™ Clinical Archive to connect 62 hospitals with disparate PACS and RIS systems, resulting in reduced storage costs, less data duplication, increased productivity and enhanced quality of care. Learn more in this case study. Centricity Clinical Archive includes the following product components: Centricity Enterprise Archive, Universal Viewer ZFP, Caradigm eHIE, Centricity Clinical Gateway, NextGate MatchMetrix EMPI and PACSGEAR PacsSCAN™.
Tags : 
xds, cross enterprise document sharing, unified view of patient images, archive & storage, image sharing & management, vendor neutral archive (vna), centricity clinical archive, mobile image capture, enterprise archive, archiving services, zero footprint viewer, image archive, dicom storage
    
GE Healthcare
Published By: Delphix     Published Date: Apr 14, 2015
Current test data management solutions fall short in at least one key dimension: quality, speed or efficiency. Technologies in the emerging Data as a Service (DaaS) category bring the benefits of virtualization to application data, delivering high-quality test data with both speed and efficiency. A toy sketch of the copy-on-write idea behind data virtualization follows this entry.
Tags : 
    
Delphix
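The virtualization idea behind DaaS can be pictured as copy-on-write: every virtual copy shares one physical snapshot and stores only its own changes. The toy Python class below illustrates the concept under that assumption; it is not Delphix's implementation:

```python
# Toy illustration of copy-on-write data virtualization: each virtual
# copy shares the source snapshot and stores only its own overrides.
# Conceptual sketch only, not how Delphix is implemented.

class VirtualCopy:
    def __init__(self, source: dict):
        self._source = source   # shared, read-only snapshot
        self._delta = {}        # per-copy changes only

    def read(self, key):
        return self._delta.get(key, self._source.get(key))

    def write(self, key, value):
        self._delta[key] = value  # source stays untouched

production = {"order_1": "shipped", "order_2": "pending"}

# Two testers get full "copies" at near-zero storage cost.
test_a = VirtualCopy(production)
test_b = VirtualCopy(production)
test_a.write("order_2", "cancelled")

print(test_a.read("order_2"))  # cancelled (local change)
print(test_b.read("order_2"))  # pending (unaffected)
```

Because each copy holds only its delta, many testers can work against full-size datasets at a fraction of the storage cost, which is the speed and efficiency claim the abstract makes.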
Published By: Cox Business     Published Date: Dec 19, 2016
Businesses need to plan for unforeseen events that can disrupt productivity, impair the customer experience, and possibly even threaten a business’s existence. One disruption every business needs to plan for is an event that destroys valuable data, inhibits access to data, or causes downtime of core applications. Consider the staggering amount of information your company stores electronically. What if an unforeseen event destroyed all financial records, client contacts, and application data? You wouldn’t be able to send customers accurate invoices. Your marketing efforts might be undermined. You would lack key metrics for measuring quality, profitability, and more. The losses could be devastating. In every aspect of life, it’s smart to plan for unexpected events. That’s especially true for two plans every business must have: a disaster recovery plan and a business continuity plan.
Tags : 
    
Cox Business
Published By: Dun & Bradstreet     Published Date: Feb 21, 2017
As the volume of data coming into organizations – from both internal and external sources – continues to grow and makes its way across departmental systems in many different formats, there is a critical need to create a single, holistic view of the key data entities in common use across the enterprise. Master Data Management (MDM) aims to accomplish this goal. Not surprisingly, MDM has become a significant priority for global enterprises, with the market expected to nearly triple from $9.4B to $26.8B by 2020, according to analysts. The reality, though, is that while seemingly everyone is investing heavily in the tools to manage data, few are putting a great enough emphasis on the data itself. And that’s a problem: poor data quality is said to be costing businesses $3.1 trillion annually in the US alone. The information being put into MDM tools must be mastered first and foremost. A minimal sketch of the record matching behind that single view follows this entry.
Tags : 
managing data, data management insight, mdm, master data management
    
Dun & Bradstreet
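Building the single, holistic view the abstract describes starts with matching records that refer to the same entity across systems and merging them into one "golden record". A minimal sketch, assuming a naive deterministic match key; production MDM tools use probabilistic and fuzzy matching:

```python
# Minimal sketch of master-data matching: records from different
# systems are grouped by a normalized match key and merged into one
# golden record. The fields and merge rule are hypothetical.

from collections import defaultdict

def match_key(rec: dict) -> tuple:
    """Naive deterministic key: normalized name + postal code."""
    return (rec["name"].lower().replace(" ", ""), rec["zip"])

def merge(group: list) -> dict:
    """Prefer the most recently updated non-empty value per field."""
    golden = {}
    for rec in sorted(group, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value:
                golden[field] = value
    return golden

crm = {"name": "Acme Corp", "zip": "10001", "phone": "", "updated": 1}
erp = {"name": "ACME CORP", "zip": "10001", "phone": "5550100", "updated": 2}

groups = defaultdict(list)
for rec in (crm, erp):
    groups[match_key(rec)].append(rec)

golden_records = [merge(g) for g in groups.values()]
print(golden_records)
# [{'name': 'ACME CORP', 'zip': '10001', 'phone': '5550100', 'updated': 2}]
```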
Published By: Fiserv     Published Date: Nov 07, 2017
Digital loan origination processes can still require significant manual support, which is often inaccurate and time-consuming. This National Mortgage News paper, sponsored by Fiserv, explains how you can improve your current loan production while reducing costs and risk of non-compliance.
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid, tila respa integrated disclosure, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset, borrower satisfaction, borrower experience
    
Fiserv
Published By: Fiserv     Published Date: Nov 08, 2017
"Learn how you can reduce loan defects, improve data quality and simplify compliance in the mortgage lending process. Co-presented with Craig Focardi, mortgage industry executive and technology advisor, this webinar provides insights into mortgage lending process challenges and how they impact experiences for lenders and borrowers, as well as the overall performance of the loan. The webinar also provides suggestions for reducing loan defects and strategies to correct them."
Tags : 
    
Fiserv
Published By: QASymphony     Published Date: Jan 08, 2018
Data. It seems to be everywhere today, and yet we can never get enough of it. But as it turns out, a lack of data isn’t our problem; our problem is the difficulty of piecing together, understanding and finding the story in all the data in front of us. In software testing in particular, the need for consolidated, meaningful test metrics has never been higher. As both the pace of development and the cost of delivering poor-quality software increase, we need these metrics to help us test smarter, better and faster. Fortunately, business intelligence tools now exist to make this goal a reality. The analytics these tools provide can help drive efficient and effective testing by giving teams insight on everything from testing quality and coverage to velocity and more. This knowledge can position the QA team as trusted experts who advise the entire software development team on steps that ensure a better-quality end result. A minimal sketch of one such consolidated metric follows this entry.
Tags : 
    
QASymphony
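As an illustration of the consolidated metrics the abstract argues for, the sketch below aggregates raw test runs into a per-area pass rate. The data and field names are made up:

```python
# Minimal sketch: consolidating raw test results into per-area
# quality metrics (pass rate). Data and field names are made up.

from collections import defaultdict

results = [
    {"area": "checkout", "passed": True},
    {"area": "checkout", "passed": False},
    {"area": "search",   "passed": True},
    {"area": "search",   "passed": True},
]

totals = defaultdict(lambda: {"run": 0, "passed": 0})
for r in results:
    totals[r["area"]]["run"] += 1
    totals[r["area"]]["passed"] += r["passed"]  # True counts as 1

for area, t in sorted(totals.items()):
    rate = 100.0 * t["passed"] / t["run"]
    print(f"{area}: {t['passed']}/{t['run']} passed ({rate:.0f}%)")
# checkout: 1/2 passed (50%)
# search: 2/2 passed (100%)
```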
Published By: Here Technologies     Published Date: Sep 26, 2018
Mobile has become the first screen for consumers in terms of internet time spent. Hyperlocal location data is unique to mobile and provides a wealth of new, valuable information about consumers. Today’s sophisticated buyers demand more transparency in location data science, better campaign performance and ROI from attribution modeling. Finding the right partner who can help build the bridge between real-world consumer behavior and mobile programmatic advertising in a transparent way is essential. To meet those challenges and address the demand for top-quality location data, adsquare integrated a global database of HERE Places in April 2017. By leveraging HERE Places and overlaying raw location data of anonymous users with POI data points, adsquare can understand exactly what consumers are doing in the real world: what places they visit, when and how often. A toy sketch of that ping-to-POI overlay follows this entry. To find out more, download this paper today.
Tags : 
    
Here Technologies
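In its simplest form, overlaying location pings with POI data points is a radius test using the haversine distance. The sketch below uses hypothetical coordinates and a made-up POI list; production systems use spatial indexes rather than brute force:

```python
# Toy sketch of overlaying location pings with POI data: a ping
# "visits" a place if it falls within a radius of the POI. The
# coordinates and radius below are made up for illustration.

from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6371000 * 2 * asin(sqrt(a))

pois = [("coffee_shop", 52.5200, 13.4050), ("gym", 52.5190, 13.4110)]
pings = [(52.52001, 13.40502), (52.51905, 13.41098), (52.53, 13.42)]

RADIUS_M = 50  # visit threshold in meters
for lat, lon in pings:
    for name, plat, plon in pois:
        if haversine_m(lat, lon, plat, plon) <= RADIUS_M:
            print(f"ping ({lat}, {lon}) matched {name}")
# ping (52.52001, 13.40502) matched coffee_shop
# ping (52.51905, 13.41098) matched gym
```

Counting such matches per place and per time window yields the "what places they visit, when and how often" view described above.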