Why partnerships are key to driving healthcare forward

Brent Shafer, CEO Philips North America

Written by: Brent Shafer

It’s no secret that our healthcare system is up against enormous challenges, from aging populations to the growing burden of chronic and lifestyle-related diseases, while providers struggle to increase both access to care and its quality. However, a brighter future is at hand through a more seamless, collaborative approach by hospitals and health technology companies that puts patient care at the center. As connected technologies continue to disrupt and transform the healthcare industry for the better, healthcare leaders must bridge the gap between bringing forward innovations that can change patient care and driving their actual integration into health systems. In fact, this year’s Philips Future Health Index (FHI) data revealed that both healthcare providers (86%) and consumers (61%) believe a more integrated healthcare system would improve the quality of healthcare in the United States.

Without better integration, global access to quality healthcare will continue to decline. If broadly adopted, however, connected technologies have tremendous potential to address the resource shortages confronting healthcare in many countries. We will be better able to track and manage the health of populations both inside and outside hospital walls, while simultaneously tackling unacceptably long wait times, rising costs and severe staff shortages.



Baton Rouge General adds POC Advisor from Wolters Kluwer to reduce sepsis

November 2, 2017 – Wolters Kluwer Health today announced that Baton Rouge General will deploy POC Advisor™, leveraging its clinical surveillance and analytics to reduce sepsis mortality and morbidity rates. The clinical intelligence platform will be a central component of the “Most Wired” hospital’s overall clinical and technology strategy.

“Using advanced technology is key to preserving and restoring health,” said Bennett Cheramie, Vice President of Information Technology at Baton Rouge General. “By making our front-line team members aware of patients with sepsis, POC Advisor will help us treat sepsis in its earliest stages.”


Precision medicine: From one-size-fits-all to personalized healthcare

Advances in technology are essential if precision medicine is going to become reality.

Imagine a future in which, rather than using symptoms to identify a disease, your genes, metabolism, and gut microbiome inform how your individual health is managed. This is the vision of precision medicine.

Traditional medicine uses symptoms to diagnose diseases, and drugs to treat these symptoms. But precision medicine aims to turn this concept on its head.

By identifying the factors that predispose a person to a particular disease and the molecular mechanisms that cause the condition, treatment and prevention strategies can be tailored to each individual. 

So, how do we get from traditional to precision medicine? Advances in genetics and molecular analysis techniques have been a deciding factor, as has getting patients involved with managing their own health.



The enterprise data warehouse is dead! Long live the enterprise data warehouse!

In my first article, I shared the observation that healthcare organizations are behaving like technology early adopters and deploying next-generation analytic architectures. This is counter to the healthcare industry’s customary, more conservative behavior regarding technology adoption. I offered the explanation that the well-known challenges and limitations of the traditional enterprise data warehouse (EDW) approach to analytics were the impetus for healthcare organizations being willing to try something new and different. I also made the point, however, that the EDW approach is far from obsolete and absolutely has a role to play in a modern-day analytics architecture. That role is as a centralized repository of trustworthy, tightly governed data that can be used broadly across the enterprise by a variety of data consumers with differing skills. With an EDW, we design a place for everything, and everything must be in its place!

This stands in contrast to the data lake approach, which offers far greater flexibility and agility in getting data in, but at the cost of greater complexity and the skills and expertise needed to derive valid data and insights. The data lake also presents the challenge of effectively cataloging and managing an inventory of the data in the lake, along with the associated access permissions, given the diverse sources and types of data the approach can support.

As with any technology solution, there is seldom a 100% right answer, and choosing an EDW approach versus a data lake approach for any particular data management and analytics need is no different. However, we can pick a couple of examples that illustrate a situation where an EDW is clearly the preferred approach and, conversely, one where a data lake is clearly preferred.

In evaluating the preferred approach, we’ll look across six dimensions, or characteristics, of the data management and analytics use-cases that I have found helpful. These are:

  • Who is the consumer of the data? Will the data be accessed and used by report writers who largely know only SQL and may be unfamiliar with the nuances and different interpretations of what the data means? Or will the data be accessed by a limited audience of data scientists and “data jockeys” with in-depth statistical knowledge and familiarity with the variety of possible interpretations of what the data may mean?
  • How broadly will the data be used across the enterprise? Will the data be used broadly in support of many diverse analytical and reporting scenarios where consistency and reliability are highly valued? Or will the data be used by a small group of individuals to conduct analysis or reporting, potentially on an ad hoc basis or answering questions of interest to a limited audience?
  • How well understood is the data itself? Is the data and its meaning well understood, whether through longstanding familiarity with the data domain, meaning established through regulation, or business practices such as financial reporting? Or is the data unknown, subject to multiple potentially conflicting interpretations, or in need of combining or augmenting with other sources of data in order to be useful?
  • How well understood and defined are the analytic or reporting requirements? What’s the plan for using the data? Are the reports, formats and analytics that will be applied to the data well defined and understood? Or do we first need to figure out what the data might be useful for and discover how it might be valuable?
  • What are the sources of the data? Does the data come from internal business applications or systems that are owned and controlled by the enterprise? The source system or application matters, since it largely determines our ability to understand the data and the context in which it was acquired, as well as to influence that acquisition to resolve any ambiguities or other data quality issues. Where we control the upstream source of the data, we have more options to correct data issues at the source and make the data “right.” Absent this control, we face the larger challenge of getting value from data that is suspect or known to be of poor quality.
  • Who is the audience for the insights and conclusions drawn from the data? Will the reports and analytic insights be used to satisfy regulatory reporting requirements, perhaps something that needs to be certified or signed off on by the CEO or CFO, or presented to the executive committee or board of directors? Or will the insights be more informational in nature, aimed at a smaller audience interested in identifying “order-of-magnitude” effects of different scenarios, where “close is good enough” is the standard of proof?

In reading these dimensions and descriptions, my hope is that your reaction is, “Well shoot! You’ve stacked the deck so badly the conclusions are obvious!” That is exactly my hope: when we decompose the who, what, how and why of the data and who will use it, the appropriate approach becomes self-evident. Guilty as charged for selecting dimensions that drive us toward one approach over another; there are countless other dimensions that could be relevant in any evaluation.

So, let’s take these dimensions out for a test drive and see how they apply in two real-world situations, as described in Table A below. The first use-case is monthly/quarterly/annual financial reporting; the second is discovery analytics that combines health system encounter data with claims data and geospatial data to determine any correlations that are predictive of compliance with follow-up appointments.

  • Who is the consumer of the data? Financial reporting: financial analysts and report writers with SQL skills only. Discovery analytics: data scientists and “data jockeys” with skills in design of experiments, advanced analytics and statistical techniques.
  • How broadly will the data be used across the enterprise? Financial reporting: data is used throughout finance and virtually every department across the enterprise to understand and quantify business performance. Discovery analytics: the data will be accessed by a small, limited group of individuals with knowledge of the data, its limitations, and therefore the reliability of any insights derived.
  • How well understood are the data themselves? Financial reporting: the meaning of the data is unambiguous and frequently determined by accounting standards or other documentation. Discovery analytics: data sets are being seen for the first time, and meaning must be derived and inferred rather than established through documentation.
  • How well understood are the analytic or reporting requirements? Financial reporting: requirements are well established through regulation, accounting standards, and best practices and customs throughout the industry and enterprise. Discovery analytics: while there are some ideas of what types of insights the data might yield, these ideas are uncertain and must be tested, and the validity of the results vetted.
  • What are the sources of data? Financial reporting: data is sourced from enterprise and departmental business applications under the direct ownership or control of the organization. Discovery analytics: data is from external sources and must be used “as-is,” with little ability to influence upstream sources for either understanding or data integrity/quality purposes.
  • Who is the audience for the insights and conclusions drawn from the data? Financial reporting: reports and analysis are submitted to regulators, used by the executive team, reviewed by the board of directors, and frequently must be certified and signed off by the CEO/CXO. Discovery analytics: a limited audience of interested individuals with knowledge and understanding of the vagaries of the analytics and approach.
  • Conclusion: for financial reporting, the enterprise data warehouse approach is clearly preferred; for discovery analytics, the data lake is clearly preferred.

Table A: Illustrative Use-cases for the EDW and the Data Lake

The financial reporting use-case clearly favors an EDW approach, since nothing about it requires the agility to accommodate new sources of data or answer new questions. It also strongly favors accurate and reliable data that can be used broadly across the enterprise. Moreover, these data sets are absolutely of known value and worthy of the significant up-front design, governance and rigorous data management practices required to get the data right. If I’m a CEO or CFO, I can feel confident that data sourced from a well-designed, well-implemented and well-managed EDW is accurate and reliable.

Conversely, the discovery analytics use-case clearly favors agility rather than intensive data management and governance discipline, and therefore a data lake approach. We know little about the sources of data, or whether the resulting analytics will reveal any useful insights, so we really want to “give it a whirl” and see what comes out. We certainly do not want to put a lot of upfront effort into designing a data model, or strict data quality processes and rules with accompanying data governance, for data that may prove to be of little value and yield no real business insights. From an investment perspective, the data lake is the right amount of effort to apply, at least until we know the insights may be worth more upfront investment. This is a situation where the value and trustworthiness of the result depend strongly on the skill and knowledge of the individual working with the data.

Modern-day EDWs are absolutely succeeding at the financial reporting use-case, and this is an affirmation of their value within the enterprise. There are many other, similar use-cases around customer acquisition, inventory management, customer retention and more that possess characteristics lending themselves to the EDW approach. Where we have traditionally struggled is in trying to make the design- and investment-intensive EDW approach work for use-cases of uncertain value, with data that is not well understood and is poorly behaved; namely, the data discovery use-case. Data discovery has always been the Achilles heel of the EDW, since the upfront investment required meant we just couldn’t take on data of unknown value. Or, if we did put in the effort and it turned out not to be useful, we got a black eye for putting all that work into something that ultimately did not yield any value. The data discovery use-case has not been left undone, however; rather, it has been accomplished through the heroic efforts of data scientists, or super-financial-analysts, using spreadsheets, scripting and other dark arts of data manipulation to come up with an answer to an immediate, pressing need. The opportunity with the data lake is to provide an enterprise-class platform to support these sorts of initiatives so that they are part of an overall enterprise data management strategy and architecture, rather than islands of ad hoc analytics with little transparency or ability to inspect and validate the approaches used. At this point, I have hopefully established two clearly differentiated use-cases that lead us down the path to either an EDW approach or a data lake approach. Ultimately, we need both. But of course, the real world is never so clear.
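Decomposed this way, the dimensions practically evaluate themselves. As a rough, hypothetical illustration (the checklist fields and the simple tallying rule below are my own shorthand, not part of the framework itself), the six dimensions can be captured in a few lines of Python that suggest which way a given use-case leans:

```python
# A minimal sketch: score a use-case against the six dimensions above.
# The field names and the "vote counting" rule are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    consumers_sql_only: bool       # report writers with SQL only vs. data scientists
    used_broadly: bool             # enterprise-wide use vs. a small group
    data_well_understood: bool     # established meaning vs. meaning still to be inferred
    requirements_defined: bool     # known reports/analytics vs. open-ended discovery
    sources_controlled: bool       # internal, controlled systems vs. external "as-is" data
    audience_regulatory: bool      # regulators/board/CEO sign-off vs. informational audience

def recommend(uc: UseCase) -> str:
    # Each dimension that favors governed, broadly shared, well-understood data
    # counts as a vote for the EDW; the rest count for the data lake.
    edw_votes = sum([
        uc.consumers_sql_only,
        uc.used_broadly,
        uc.data_well_understood,
        uc.requirements_defined,
        uc.sources_controlled,
        uc.audience_regulatory,
    ])
    if edw_votes >= 4:
        return "EDW preferred"
    if edw_votes <= 2:
        return "Data lake preferred"
    return "Hybrid: start in the lake, promote to the EDW if the use-case proves out"

# The two use-cases from Table A, encoded against the checklist.
financial_reporting = UseCase("Financial reporting", True, True, True, True, True, True)
discovery_analytics = UseCase("Discovery analytics", False, False, False, False, False, False)

for uc in (financial_reporting, discovery_analytics):
    print(f"{uc.name}: {recommend(uc)}")
```

Of course, real use-cases rarely score as cleanly as the two in Table A, which is exactly where the discussion goes next.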
Therefore, I will suggest that the EDW and the data lake should not be implemented or operated in isolation, but instead should operate in combination to meet the diverse needs of the enterprise. What about the situation where I need to quickly support financial reporting across a newly merged enterprise, with new data sources from the merged entity? That is a financial reporting use-case, but one where agility is critical. Alternatively, what about the situation where a quick-and-dirty analysis conducted by a data scientist using the data lake (a classic data discovery use-case) yields valuable insights that I now want to make broadly available and repeatable for data consumers with only SQL skills?

Figure 1: Modern Analytics Architecture Incorporating both an EDW and Data Lake

These situations illustrate the need for the EDW and the data lake to operate in unison, as illustrated in Figure 1. Equally important, they are a strong reason not to let independent islands of “data heroes” persist, as these isolated efforts do not lend themselves to an easy on-ramp into the EDW. For quickly delivering financial reports following a merger, I can use the data lake with a “data jockey” with deep technical skills working in collaboration with a financial analyst with deep business knowledge to quickly derive accurate, reliable financial reporting from data in unfamiliar formats from unfamiliar sources. Using the data lake need not imply results that are slipshod or somehow less accurate or valid than the EDW; getting trustworthy results just requires a different set of skills and tools. Similarly, once I have done the data discovery work to figure out the format and meaning of new external data sources and establish the valuable questions that can be answered, it should be a straightforward and predictable process to “industrialize” this use-case by enhancing the EDW data model and building the corresponding data loading routines and appropriate data governance to promote it to the EDW.

In summary, the EDW is alive and well and has a critical role to play in a modern analytics architecture. In fact, pairing an EDW with a data lake allows the EDW to focus on what it really does well, and to finally shed the “costs too much, takes too long, isn’t agile” label it has labored under for far too long. Similarly, many of the indictments of the data lake as being prone to becoming a “data swamp” can be mitigated by pairing it with a well-designed, well-implemented and well-managed EDW, and then using the data lake when and where its virtues of agility are merited. Next up: how do we govern data in this new world, where data can exist in the EDW or the data lake and be used by different people, with varying skills, for different purposes?


Beyond privacy concerns: Interactive gadgets can pose threat to children’s psychology

Children, who are learning what’s appropriate social interaction, can be affected more than adults by the human-computer relationship that’s becoming more commonplace in homes. In other public health news: early menopause, the shingles vaccine, fatty liver disease, racism, and gun safety.

NPR: Parenting in the age of Alexa: Are artificial intelligence devices safe for kids?
Earlier this month, the toy giant Mattel announced it had pulled the plug on plans to sell an interactive gadget for children. The device, called Aristotle, looked similar to a baby monitor with a camera. Critics called it creepy. Powered by artificial intelligence, Aristotle could get to know your child — at least that was how the device was being pitched. (Doucleff and Aubrey, 10/30)

The New York Times: Underweight women at risk of early menopause
Underweight women are at increased risk for early menopause, a new study has found. This study, in Human Reproduction, followed 78,759 premenopausal women ages 25 to 42 beginning in 1989. Over the following 22 years, 2,804 of them reported natural menopause before age 45. (Bakalar, 10/26)


Amazon is poised to enter the pharma landscape — so what will that look like?

Stat: Who wins and who loses if Amazon enters the prescription drug business

Will pharma be the next business Amazon disrupts? In industry after industry, the company has turned practices and expectations inside out — and the pharmaceutical world is the latest poised for change as speculation mounts that Amazon will soon start selling prescription medicines. Anticipation has been building for months, in fact, but it heightened last week on the news that Amazon (AMZN) obtained wholesale pharmacy licenses in at least a dozen states. (Silverman, 10/30)

Bloomberg: Amazon’s ambitious October spooks stocks standing in its path
The looming threat of Amazon.com Inc. siphoned billions in market cap from Under Armour Inc. to FedEx Corp. to Walgreens Boots Alliance Inc. — more than $30 billion combined — in October. Companies are gearing up to face Bezos’s behemoth heading into the holiday season, with some appearing ready to get creative as the state of their industries is shaken. (Smith, 10/31)

The New York Times: The more lavish the gifts to doctors, the costlier the drugs they prescribe
When drug companies give gifts to doctors, the doctors prescribe more — and more expensive — drugs. The more lavish the gifts, the greater the effect. Researchers used data from the Center for Medicaid and Medicare Services on the prescriptions written by doctors in Washington, and information from the D.C. Department of Health on gifts from pharmaceutical and medical device companies given to providers in 2013. (Bakalar, 10/25)



Automation spells relief for Medicare Advantage organizations

Daron Domino, Vice President, Messagepoint

Written by: Daron Domino

It’s a familiar, tiring drill every year for those who work for Medicare Advantage Organizations (MAOs). There is a frantic rush from the time your organization makes its bid submission on the first Monday in June through the middle of August, when plan materials need to go to print. Material preparation teams are tasked with preparing benefit materials such as the Annual Notice of Change, Evidence of Coverage, and Summary of Benefits that are compliant with CMS marketing guidelines, ensuring that all regulatory language, benefits and operational information is 100 percent accurate. Sitting between Product and Compliance, material preparation teams are challenged during annual enrollment period (AEP) preparation as they process a sea of ad hoc change requests.

Because of CMS timelines and the Sept. 30 in-home date, the annual update process typically starts before benefit and plan approvals are complete. So there is a risk of doing work to prepare materials for a plan, or plans, that may not receive CMS approval.

For most MAOs, the annual materials preparation process has been largely manual, resulting in increased effort and risk of error. Many have attempted to automate the process, only to find significant lead times, costs, complexity, and ultimately a different set of challenges.

With a traditional document automation approach, data and rules drive the process. Those taking this approach spend most of their time managing a long list of business rules. One company we interviewed ended up with thousands of rules to drive benefits and content across their Individual market and Group plans. Automation was intended to simplify the annual process, reduce time and effort, and ultimately eliminate human error. Their attempt resulted in a process that requires an IT skill-set to set up and manage complex rules and a complex QA process to assess the impact of rule changes across plans, both of which have effectively eliminated any cost benefit. As a result, the team supporting materials preparation continues to work endless hours each summer.
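To make the rule-explosion problem concrete, here is a deliberately simplified, hypothetical sketch (the plan IDs, benefits and copay language are invented, and this is not any vendor’s actual implementation) of how a traditional rules-driven approach pairs each plan, market and benefit combination with its own content rule:

```python
# Hypothetical sketch of a traditional rules-driven document assembly step.
# Plan IDs, benefit names and copay language are invented for illustration;
# the point is that each plan/market/benefit combination tends to need its own rule.

RULES = [
    # (plan_id, market, benefit, content snippet to include)
    ("H1234-001", "Individual", "Primary care visit", "You pay a $10 copay per visit."),
    ("H1234-001", "Individual", "Specialist visit", "You pay a $45 copay per visit."),
    ("H1234-002", "Group", "Primary care visit", "You pay a $5 copay per visit."),
    ("H1234-002", "Group", "Specialist visit", "You pay a $35 copay per visit."),
    # ...in practice this list grows into the thousands as plans and benefits multiply.
]

def assemble_section(plan_id, market):
    """Select every content snippet whose rule matches the given plan and market."""
    return [content for p, m, _benefit, content in RULES if p == plan_id and m == market]

if __name__ == "__main__":
    for line in assemble_section("H1234-001", "Individual"):
        print(line)
```

Every new plan, benefit variation or market segment adds entries to a list like this, which is how organizations end up managing thousands of rules and the QA burden that comes with them.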



EMR data integration: Still a long way to go

Bill Sun, X by 2

Written by: Bill Sun

The advent of Electronic Medical Records (EMR) has been a boon to the operational and analytical capabilities of healthcare analytics firms and medical providers, allowing them to more accurately document and track care and to create the analytics required to better care for patients. Likewise, patients have benefited from the ability to track and monitor their care needs over time, allowing proactive action to be taken when identified and recommended by their healthcare providers. One of the core goals of an EMR-based medical practice is to improve the overall quality of care over time across its particular patient population. This is welcome news all around, but it ultimately depends entirely on the effective and efficient creation, organization, normalization and distribution of medical practice and patient-related data. And that’s where the trouble begins.

For all its benefits—realized and potential—the process of aggregating and consuming EMR (and its offspring, the Electronic Health Record or EHR) information is still problematic at best. To date, there is no single technical standard for creating, formatting and distributing EMR data between healthcare analytics firms and EMR vendors. The net result is that the accuracy and usefulness of EMR data are suboptimal. That’s not only an operational issue for healthcare analytics firms but also a serious quality-of-care issue for patients.
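As a rough illustration of the normalization burden this creates, consider two hypothetical vendor exports that describe the same kind of encounter in different shapes (the field names and formats below are invented for illustration; they are not any vendor’s actual schema):

```python
# Hypothetical sketch: normalizing patient-encounter records exported by two EMR
# vendors into one common shape. Field names and formats are invented; real feeds
# differ per vendor and per site, which is exactly the integration problem.

from datetime import datetime

def from_vendor_a(rec):
    return {
        "patient_id": rec["PatientID"],
        "encounter_date": datetime.strptime(rec["VisitDate"], "%m/%d/%Y").date(),
        "diagnosis_codes": rec["DxCodes"].split(";"),
    }

def from_vendor_b(rec):
    return {
        "patient_id": rec["mrn"],
        "encounter_date": datetime.strptime(rec["date_of_service"], "%Y-%m-%d").date(),
        "diagnosis_codes": [c.strip() for c in rec["diagnoses"]],
    }

if __name__ == "__main__":
    a = {"PatientID": "A-100", "VisitDate": "10/02/2017", "DxCodes": "E11.9;I10"}
    b = {"mrn": "B-200", "date_of_service": "2017-10-03", "diagnoses": ["E11.9 ", "Z79.4"]}
    print(from_vendor_a(a))
    print(from_vendor_b(b))
```

Multiply this mapping work across every vendor, feed and code set, and the absence of a single standard becomes an ongoing maintenance cost rather than a one-time fix.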



Re-evaluating the Role of RCM for MACRA Success

It’s hard to believe we’re already well into the latter half of the first Medicare Access and CHIP Reauthorization Act (MACRA) reporting year. With the clock ticking, pressure continues to mount for physician organizations to close the inaugural year with strong momentum before 2018. Although CMS recently proposed several changes that may make Year 2 of the MACRA Quality Payment Program (QPP) less challenging for some providers, it could be more demanding for other organizations.

For Year 1, healthcare leaders identified “revising data management/reporting mechanisms to meet new reporting requirements” as the top QPP challenge within the fifth annual Healthcare IT Industry Outlook Survey. That difficulty is unlikely to change for 2018. Because the QPP ties provider reimbursement incentives to care quality, improvement activities, costs and electronic health record (EHR) utilization, alignment has been critical this year, but will be especially crucial moving into 2018. Given that, what opportunities can the revenue cycle management (RCM) team take advantage of to improve their MACRA performance for the remainder of this year into next?

Assembling the Right Team

Traditionally, provider organizations have focused their reporting on claims and reimbursement data, working within separate data silos disconnected from associated facilities. In the transition to value-based care, they’re finding it imperative to work cohesively across three key resource groups to obtain the right MACRA data needed to attain positive, or even neutral, adjustments to reimbursement.

Providers need to form a strategic cross-disciplinary team including representation from financial, clinical and operational IT departments to maximize access to the right data across the patient-care continuum. Considering the two reporting paths, either Advanced Alternative Payment Models (APMs) or the Merit-based Incentive Payment System (MIPS), the revenue cycle representatives on the committee can offer key insights into program measure selection.

In fact, RCM representation is so critical that one might argue a successful MACRA strategy cannot move forward without it. This financial team offers specific insight into which MACRA measures to select by tapping into its knowledge of values and performance from past physician incentive reporting programs. For example, historical data from the Physician Quality Reporting System (PQRS) and the Value-Based Payment Modifier (VBPM) can point to which measures offer the highest potential for positive reimbursement adjustments. The QPP provides options of measures to report within each category, and the right scoring criteria can make a big difference in reporting success.

Making the Most of MACRA
With RCM representation as a critical decision maker, consider utilizing these best practices to excel with MACRA QPP reporting.

  • Get the basics. Get a solid understanding of the QPP and the two path options. Consider the best reporting measures from the full set of options for your organization. Utilize industry resources like the Centers for Medicare & Medicaid Services (CMS) QPP measure guide for your pick-your-pace reporting to avoid penalty before the first year ends.
  • Use your physician group’s medical specialties mix to your advantage. Many program measures are focused on specific population health concerns, which may relate to some medical specialties more heavily than others. If your providers already have specific-focus care programs for one of the chronic disease groups, attaining key measures for those may be easier.
  • Check out helpful resources. CMS released several resources for MIPS-eligible clinicians through the QPP. The MIPS participation factsheet covers program exemptions, participant expectations and guidelines. CMS-approved MIPS qualified registries serve as a data submission option on behalf of both individual eligible clinicians and groups.
  • Look ahead to expanded exemptions. QPP Year 2’s leniencies relieve small and rural practices by expanding the exemption threshold to cover clinicians or groups who have billed less than $90,000 in Medicare Part B or treat fewer than 200 Part B patients. Your RCM staff should be able to determine quickly if your practice is below this threshold (see the sketch after this list).
  • Set the foundation. MACRA reporting is lost without accurate data. Data analytics tools can work with optimized EHR systems to effectively collect, maintain, document and analyze meaningful patient data, moving beyond the abyss of unstructured information and capturing the entire picture of patient care. Likewise, while providers can continue using 2014 certified EHR technology (CEHRT) for MIPS Year 2, those who use 2015 CEHRT are eligible for a 10 percentage point bonus under the Advancing Care Information (ACI) category.
  • Know where you stand. If you submitted quality data in the last calendar year, consider your Quality and Resource Use Report (QRUR) from CMS, which analyzes performance at the Tax Identification Number (TIN) level. Reviewing this will help you assess performance in terms of cost and quality and identify areas to improve. Aspects of this report, like PQRS and the Value-Based Payment Modifier, are rolled into MACRA. In addition, the QRUR will highlight your physicians’ ranking among their peers to give you added insight into how forecasting will impact reimbursement as the reporting year unfolds.
  • Ensure accurate coding. Make sure billers and coders fully assess the coding system to find inaccuracies, looking for risks like downcoding or missing modifiers that can impact the overall picture of care and revenue opportunity. The importance of this type of post-care analysis cannot be overstated as these steps can dramatically alter a physician’s composite score.
  • Use 2017 to solidify footing. Reporting requirements will only accelerate after 2017. Although the pick-your-pace guideline applies this year, 2018 requires a full year of data for both the quality and cost categories (though cost has no weight on the final score). An organization should use this initial reporting year to reassess performance improvement measures while cementing its MACRA governance committee for long-term success. Under MIPS, the reimbursement adjustment and/or penalty opportunity increases to 9 percent by 2022, so use this first year as a jumping-off point.
  • Consider future path options. While most organizations may report under MIPS within this first year, they don’t necessarily have to follow suit in future reporting periods. With the insight from financials, decide whether MIPS or APMs reporting best fits long-term stability and growth.
  • Explore virtual groups. An addition to the QPP for Year 2 is virtual-group participation in MIPS. Virtual groups consist of solo practitioners and groups of 10 or fewer MIPS-eligible clinicians who come together virtually with at least one other solo practitioner or group for 2018 performance participation. Your RCM staff is integral in deciding whether a virtual group will be financially advantageous for the practice in Year 2 and beyond.
  • Evaluate the future composition of physician groups. Change is constant within the overall composition of physician groups, whether from new additions or attrition. Managers need to consider this financial perspective for its impact on MACRA reporting, since there is a two-year gap between financial reporting and reimbursement adjustments. QPP final scores will impact providers beyond reimbursement, as physician scores will be published online and shared on third-party sites.
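As a quick illustration of the expanded-exemptions item above, the Year 2 low-volume threshold can be expressed as a simple check using the figures cited in that item (a sketch only; actual eligibility is determined by CMS):

```python
# Quick low-volume-threshold check for QPP Year 2, using the figures cited above
# (less than $90,000 billed in Medicare Part B OR fewer than 200 Part B patients).
# A sketch only; CMS publishes the authoritative eligibility determinations.

def exempt_from_mips(part_b_billed_dollars, part_b_patient_count):
    return part_b_billed_dollars < 90_000 or part_b_patient_count < 200

print(exempt_from_mips(75_000, 350))   # True: below the billing threshold
print(exempt_from_mips(120_000, 150))  # True: below the patient-count threshold
print(exempt_from_mips(120_000, 350))  # False: exceeds both thresholds
```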

So whether you are beginning your MACRA reporting path, moving through strategic planning or in the midst of data collection, consider the significance of RCM on QPP reporting alignment and success, including governance leadership and insight into operational effectiveness. Without financial entities at the MACRA decision-making table, the full picture of reporting measure selection and outcomes cannot be seen.

 



Four steps to optimizing and streamlining healthcare payments

Jeffrey W. Brown, President, VPay

Written by: Jeffrey W. Brown

The stress level in physician practices is high – and not just for patients worried about their health. Office staff must balance patient scheduling and visits with office tasks and workflow, plus keep up with billing and collections. On top of that, it can seem like every task you undertake is governed by regulations, mandates, procedures and laws.

Too often, physician practice employees feel that their choices are limited to a finite set of options that don’t necessarily deliver the optimal, or desired, result. This can feel particularly true of payments—which are essential to a practice’s existence, but can be difficult and cumbersome to process.

There’s good news for practices spending too much time and effort on managing payments. You have options that can streamline payment and reconciliation, minimize risk and reduce costs. Perhaps surprisingly, practices should know that the best payment process may consist of several methods; you can adjust how you receive payments based on your unique practice requirements.

Optimizing payments, reducing costs
The following are four steps you can take to optimize your practice payments process and reduce costs.

