
FYI Solutions Blog

Oct 21, 2014

Seven overlooked benefits of temp IT staffing vs. direct permanent hiring

Author:  Ralph Cetrulo

Here are seven commonly overlooked benefits of IT Staffing versus direct permanent hiring of IT talent.

  1. Access Top Talent – IT staffing firms attract, employ, and recruit highly skilled IT talent. Often the talent is highly specialized, and for that reason many IT consultants are not even interested in full-time employment. Top talent can also come in the form of retired baby boomers who possess subject matter expertise and can return as consultants.

  2. Try Before You Buy – Hiring the right employee is a challenging process. Hiring the wrong employee is expensive, disruptive to your work environment, and time consuming. Converting a consultant or temp allows you to thoroughly assess a person's skill level and cultural fit before making them an employee. The working interview is the best way to hire!

  3. Flexibility – Consultants can be brought on much faster than direct hires, can be engaged for specific projects or specific skills, and the contract can be ended simply and with little risk. Firms that rely only on permanent staff risk being overstaffed.

  4. Eliminate Payroll Tax Administration – Some of the more often overlooked benefits include eliminating federal and state payroll filings and year-end W-2 administration costs. Converting 1099 independent consultants into staffing company consultants also reduces IRS misclassification risk.

  5. No Unemployment or Workers' Comp Exposure – With consultants, there is no obligation to pay unemployment insurance premiums or workers' comp administration expenses; this is all handled by the staffing firm.

  6. No Benefits Administration – The consultant's employee benefits are managed by the staffing firm. Also, contract employees are not eligible for the same benefits as employees.

  7. Eliminate HR Issues – With consultant hiring, there is no formal corporate hiring process, which reduces the need for HR support. Most importantly, any issues are handled by the staffing firm immediately.

 

The Bottom Line is Reduced Cost!

  • Consultants are paid out of an Operating budget. That means it’s an expense that can be written off.
  • No administrative cost for hiring and paperwork
  • No payroll processing expense
  • No workers comp or employee benefits administration
  • No added expense of paid time off, Vacation, or Holiday pay
  • Less supervisory cost
  • No training costs

The next time you have a staffing need, consider the pluses that come with a temporary IT staffing arrangement.  FYI Solutions has been a leader in Information Technology staffing for over 30 years.  Please contact us for more information on how we can help you fulfill your staffing requirements.

Share
Oct 15, 2014

Top Facebook Tips & Tricks you should know


Author: Patty Ploykrachang

As Facebook becomes as much a corporate tool as a place for connecting with friends and family, it is good to know how to get to the information you need to understand your customers and prospects without sacrificing your privacy, or the privacy of others. Here are a few hints that we have found useful at FYI Solutions.

Facebook makes its money from knowing who its users are and what they are interested in. It does this by finding out what the user is searching for, both on and off its website; the company tracks almost everything that you do. To limit this, you can use a browser extension such as "Disconnect" or "Ghostery". Both are available for Firefox, Chrome, Opera, and Safari. Once installed in your browser, the extension blocks Facebook from retrieving information about what websites you're visiting. "Disconnect" is available for your mobile device as well.

Facebook will never notify you about messages from people who are not in your network, so you may be missing out on hidden messages! If you receive a message from someone with whom you have no mutual friends, the message automatically gets filtered into the "Other" message folder (located right next to "Inbox"). So the next time you are checking your messages, keep the "Other" folder in mind. Please see the picture below.

[Image: the "Other" message folder]

Stop auto-play videos on your mobile and desktop. While scrolling down the news feed, it can get irritating having videos play by themselves. Click here to find out how to disable video auto-play on your mobile. From your desktop, select the down arrow in the top right and choose "Settings", then "Video" from the bottom left side menu, and then "Auto Play Off." Please see the picture below.

[Image: the video auto-play setting]

Most Recent vs. Top Stories. You can change what you see in your news feed. By selecting the down arrow next to "News Feed" at the top left of your news feed, you have the option to view either "Most Recent" or "Top Stories."

Your customized list isn't private. Be aware that whenever you make a custom list and post to it, you can expose whoever is added to that list. Please see the picture below.

[Image: custom list visibility]

Log out remotely. Don't worry if you forgot to log out on someone else's computer. You can sign in to Facebook and log out remotely: on the top right choose "Settings", then the "Security" tab, and click "Edit" next to "Where You're Logged In." This will show you where your account is logged in, and you can choose "End Activity" to log out right away. You can also choose to be notified via SMS or email if a new computer or mobile device logs into your account.

FYI Solutions has been a leader in business analytics for over 30 years. Tools like Facebook and Twitter have opened up a new area within Analytics:  Social Media Analytics. For more information about Social Media Analytics, and how to use social data to understand your customer and improve ROI, contact us.

Share
Oct 01, 2014

How You Can Identify Consulting Firms That Value You Most

Author: Gregg Ruoti

By last count, there are literally hundreds of IT search/consulting firms in the Greater NY metropolitan area. Add the out-of-state and sometimes offshore firms that service the IT needs of this area, and you have an overwhelming number of firms competing for the right to employ your services. These firms range from multi-billion-dollar publicly traded companies to individuals working out of their homes. The dilemma is how best to narrow down this huge list to a manageable number of quality firms that are the most likely to result in you finding your next project.

Obviously, a common place to start is using your network of trusted associates to recommend search/consulting firms. They will have had first-hand experience with these firms and should be able to tell you their preferred point of contact at each.

But your next step involves reaching out to companies that neither you nor your associates have experience with.  This is where you need to do some research to identify companies and individuals at these firms who truly have credibility in your technology domain.  If you want to reach businesses that truly specialize in your field, you need to seek out the firms that have dedicated the time, money and effort to establish themselves as subject matter experts dedicated to providing solutions to their clients in specific areas of technology.

Using an example of a Business Intelligence Architect with Cognos, this individual should be seeking out search/consulting firms following this formula:

1. Look for companies that have either a Business Intelligence or Business Analytics Practice.

Also check whether the firm attends or sponsors conferences, or provides training, in Business Intelligence or Business Analytics.

This signals that the business has dedicated substantial financial resources to this vertical and has full-time employees dedicated to finding and creating solutions to their clients' needs in this arena.

2. Seek out search/consulting firms that are Certified IBM Business Analytics Business Partners.

This certification signifies that the firm is very serious about providing Cognos-based solutions to its clients with people just like you.

3. Go to LinkedIn and review their employees to see if any of them carry Certifications such as (but not limited to):

  • IBM Certified Solution Expert
  • IBM Certified Developer
  • IBM Certified Designer
  • SPSS
  • IBM Business Analytics Certification
  • Certified Business Intelligence Professional (CBIP)

If after reviewing some of their backgrounds you see the firm has employees or consultants with relevant certifications/qualifications, that is a strong indicator this company attracts good people and invests in their training to keep them up to date and marketable.

Clearly, depending on your area of IT specialty, you would fine-tune the parameters of your search for type of practice, partnership, and certifications. This method will minimize the distractions of the multitude of ill-suited firms and help you identify the quality search/consulting firms that place the most value on your particular skillset, and thus will better integrate you into their culture and their clients.

FYI Solutions has been a leader in specialized staffing and solutions for 30 years. Please check our website at www.fyisolutions.com to see some great opportunities available through us!

 

 

Share
Sep 25, 2014

Tips for your next job interview!

Author:  Michele D’Aries

I recently read an article by Cecile Peterkin, "15 Ways to Win a Job Interview." It offers great tips and a reminder of what we can do to win the job. When candidates go for an interview, they should be prepared. Research the company by going to its website, and be sure to read the CEO's message, which you will find on the website or in the company's annual financial report. If you are on LinkedIn, make sure your profile is suitable for the position. Know how to greet your interviewer and how to match their style. After the interview is done, ask if the interviewer has any remaining questions for you. One question I ask my candidates to consider asking the interviewer:

Is there any reason that you think I would not be a good candidate for this position?

This is not an easy question to ask, but if the interviewer has this thought, it will give you a chance to present a rebuttal. At the end of the interview, thank the interviewer for their time, let them know you are very interested in the position, and ask them to consider you to be part of the team.

I would suggest taking the time to read "15 Ways to Win a Job Interview" by Cecile Peterkin. You just may pick up a few tips to beat out the competition!

http://www.streetdirectory.com/travel_guide/187958/job_interview/15_ways_to_win_a_job_interviews.html

FYI Solutions is a leader in specialized staffing and business analytics. We look forward to placing you in that next position! Contact us for more information.

Share
Sep 18, 2014

Big Business is Watching!

Author: Marianela Peraza

Big business is watching, and it will watch even more carefully in the next few years! From your favorite TV show to your favorite candy bar, businesses that want to keep growing successfully will need to know it all. A new report by MarketsandMarkets (http://www.marketsandmarkets.com/Market-Reports/social-media-analytics-market-96768946.html) projects that the Social Media Analytics market will grow from $620.3 million in 2014 to $2.73 billion by 2019! This is no surprise: the benefits for businesses are so apparent that ignoring social media monitoring and intelligence can seriously damage a brand that is not meeting its customers' expectations. Understanding what your customers like and what they purchase can be very powerful for your marketing. Even knowing when a customer is not happy with your brand provides valuable information to help improve your product, customer service, and more. If leveraged well, social insights can reveal consumer opinion and trends, and can be useful for making future predictions.

When social media analytics began, it was all about tracking the number of fans/followers, the number of "shares," and website visitors. Today, those metrics have changed as better tools arose to reduce big data into a more manageable group of metrics. Does your strategy translate into meaningful insights that guide your business tactics? If the answer is no, you are in good company. Many businesses still struggle to produce actionable insights from these metrics. An article on www.business2community.com offers six key reasons for the disconnect between real-time social media analytics and ROI.

1. Speed

Social media is always "on," 24/7, and literally millions of pieces of social media content appear every second. Who can keep up with all that?
So while the massive amount of data slows down the social media analytics process, other factors also account for why insights are so slow to emerge.

2. Getting social media metrics to the right people.

Often, social media is treated like the ugly stepchild within the marketing department and real-time social media analytics are either absent or ignored.

Real-time social media analytics create serious challenges for many organizations. Often, organizations are married to an old paradigm — a vestige of by-gone days when data was hard to get, taking months of data gathering and analysis. These organizations didn’t integrate data gathering into tactical and strategic decision-making because they couldn’t. Incorporating real-time analytics just isn’t possible within their existing environment.

For one thing, real-time analytics requires moving analysts closer to decision-makers and enabling decision-makers with analytic skills for ad hoc data analysis. But, that’s not what most businesses look like. Many decision-makers lack the analytics skills necessary for ad hoc analysis.

Newcomer Uber, which runs a ride-sharing service that competes with taxi companies and car services, uses real-time analytics to show how people move around a city at any given time, allowing Uber to optimize its customer service. Placing cars near demand reduces competition with local cab companies, and real-time analytics provide the insights necessary to do that. Uber also uses real-time data to incentivize more drivers to provide service by raising the price of a ride.

Others such as Samsung and NASCAR do a great job of providing real-time social media analytics to guide decision-makers. NASCAR uses a control center to monitor chatter surrounding their events.

3. Visualization

Visualizing real-time social media analytics is another key element involved in developing insights that matter.

Face it: human beings don't do a great job of processing long tables of numbers. Notice that in NASCAR's command center, much of the data is displayed visually.

Simply displaying values graphically helps in making the kinds of fast interpretations necessary for making decisions with real-time data, but adding more complex algorithms and using models provides deeper insights, especially when visualized.

4. Unstructured data is challenging.

Unlike the survey data firms are used to dealing with, most data (IBM estimates 80%) is unstructured, meaning it consists of words rather than numbers. And text analytics lags seriously behind numeric analysis.

While unstructured data tends to muck up any kind of analysis, it's especially challenging in the context of real-time analytics, because you want interpretations in real time. Handling text in real time often means relying on computer-generated interpretation of the written word. However, no computer can effectively categorize much of what's written in social media, where "bad" might mean bad or it might mean good, depending on context, relationship, and other variables.
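A toy example makes the ambiguity problem concrete. The word list and posts below are invented for illustration; a keyword-based classifier flags slang praise as negative because it has no sense of context:

```python
# Toy illustration (invented word list and posts) of why naive keyword
# categorization fails on social media text: the same word flips
# polarity with context, e.g. "bad" used as slang praise.
NEGATIVE_WORDS = {"bad", "awful", "broken"}

def naive_sentiment(post):
    # Strip punctuation, lowercase, and look for any flagged word.
    words = {w.strip(".,!?").lower() for w in post.split()}
    return "negative" if words & NEGATIVE_WORDS else "non-negative"

# A literal reading is right the first time and wrong the second:
print(naive_sentiment("This phone is bad, avoid it"))       # negative (correct)
print(naive_sentiment("This track is so bad, I love it!"))  # negative (misreads praise)
```

Real text-analytics systems add context models on top of this, but as the post notes, doing that reliably in real time remains hard.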

5. Increasing signal to noise.

Social media data is inherently noisy. Reducing noise enough to even detect signal is challenging, especially in real time. Sure, with enough time, new analytics tools can ferret out the few meaningful comments across various social networks, but few can handle this in real time.

6. A “wait and see” attitude.

Again, businesses are used to a certain operational model that makes real-time social media analytics challenging. For instance, we listened to a presentation by an analyst from NPR. He showed complex A/B testing used to determine the effectiveness of headlines, even whole articles, online. As a statistician, he's concerned about achieving statistical significance in his testing before making decisions.

And that’s great if you’re talking about putting $100 million into building and marketing a product, but doesn’t make much sense in the fast-paced world of social media. Real-time analytics require real-time decisions. Period.

If it'll take you several days to gather enough data for statistical significance, forget it. Especially if you're only trying to determine which headline does better; by the time you have a statistically significant answer, no one cares anymore. The news trend has moved on to another topic.

FYI Solutions is a leader in business analytics and has partnerships with IBM, Tableau, and Microsoft. Give us a call to understand how to incorporate social media analytics into your environment and reap the benefits of increased understanding of your clients and prospects.

 

Share
Sep 10, 2014

Data munging or data wrangling: the latest process in the data world!

Written by: Joe Wasiuk

I think this is an interesting article from the NY Times, written by Steve Lohr. The term "data science" has been around longer than I thought, and I imagine it is now widely known that companies have a demand for "data scientists." It was interesting to read that data scientists spend 50 to 80 percent of their time mired in the mundane labor of collecting and preparing unruly digital data before it can be explored for useful insights.

Please see a few of the article highlights below and here is a link to the full article:

http://www.nytimes.com/2014/08/18/technology/for-big-data-scientists-hurdle-to-insights-is-janitor-work.html?_r=0

Technology revolutions come in measured, sometimes foot-dragging steps. The field known as "big data" offers a contemporary case study. The catchphrase stands for the modern abundance of digital data from many sources — the web, sensors, smartphones and corporate databases — that can be mined with clever software for discoveries and insights.

Yet far too much handcrafted work — what data scientists call "data wrangling," "data munging" and "data janitor work" — is still required.

Several start-ups are trying to break through these big data bottlenecks by developing software to automate the gathering, cleaning and organizing of disparate data, which is plentiful but messy. The modern Wild West of data needs to be tamed somewhat so it can be recognized and exploited by a computer program.

Timothy Weaver, the chief information officer of Del Monte Foods, calls the predicament of data wrangling big data’s “iceberg” issue, meaning attention is focused on the result that is seen rather than all the unseen toil beneath. In the food industry, he explained, the data available today could include production volumes, location data on shipments, weather reports, retailers’ daily sales and social network comments, parsed for signals of shifts in sentiment and demand.

But if the value comes from combining different data sets, so does the headache. Data from sensors, documents, the web and conventional databases all come in different formats. Before a software algorithm can go looking for answers, the data must be cleaned up and converted into a unified form that the algorithm can understand. Data experts try to automate as many steps in the process as possible. “But practically, because of the diversity of data, you spend a lot of your time being a data janitor, before you can get to the cool, sexy things that got you into the field in the first place,” said Matt Mohebbi, a data scientist and co-founder of Iodine.
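As a small, hypothetical illustration of this cleanup step (the file contents, column names, and figures below are invented), here is the kind of reconciliation a data scientist might do in Python with pandas before any analysis can start, unifying two sources that report the same sales data in different formats:

```python
import io
import pandas as pd

# Hypothetical example: two feeds report the same sales figures in
# different formats, the kind of "data janitor work" described above.
csv_a = io.StringIO("region,sales\nNorth East,1200\nsouth,980\n")
csv_b = io.StringIO("Region;Revenue\nNORTH EAST;1,450\nWest;700\n")

a = pd.read_csv(csv_a)
b = pd.read_csv(csv_b, sep=";", thousands=",")  # different delimiter and number format

# Unify column names and normalize the region labels before combining.
b = b.rename(columns={"Region": "region", "Revenue": "sales"})
for df in (a, b):
    df["region"] = df["region"].str.strip().str.title()

combined = pd.concat([a, b], ignore_index=True)
totals = combined.groupby("region", as_index=False)["sales"].sum()
print(totals)
```

Only after this janitorial pass do the two feeds agree on what "North East" means and what the numbers are, and the "cool, sexy" analysis can begin.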

The big data challenge today fits a familiar pattern in computing. A new technology emerges and initially it is mastered by an elite few. But with time, ingenuity and investment, the tools get better, the economics improve, business practices adapt and the technology eventually gets diffused and democratized into the mainstream.

FYI Solutions is a leader in Business Analytics, and can help you get started on the “data wrangling” required for Big Data success. Contact us for more information.

Share
Aug 26, 2014

Predictive Analytics with Tableau + R

Author:  Joe Rodriguez

Want to know which airport, airline, and flight will get you to your destination on time? FYI Solutions hosted an event on Tuesday, August 19th, where a demo was presented to illustrate how Tableau 8.2 with open-source R can be used for such predictive analytics. Our partner Tableau Software conducted the demo using Research and Innovative Technology Administration (RITA) data available to the public. RITA coordinates the U.S. Department of Transportation's (DOT) research programs. Roughly 155 million rows of data going back several years were used for this particular demo.

RITA: http://www.transtats.bts.gov/Tables.asp?DB_ID=120

For this demo instance, an extract file was used, and Tableau intuitively created the metadata of dimensions and measures based on the data types. From that point forward, dragging and dropping data elements onto the visualization canvas for analysis was easy and extraordinarily fast considering the ~155 million rows of data. The analysis assumed the roles of different individuals in a typical company, each with unique visualization interests, and each one was immediately able to see patterns in the data that would otherwise not be obvious. Four separate visualizations were created and brought together in a single interactive dashboard that responds to the specific filters applied – all this in a matter of minutes!


The next assumed role was that of a Business Analyst tasked with producing a forecast. This is where integration with open-source R was used to perform statistical analysis on historical data and produce a forward-looking forecast. Integration with R requires scripting the predictive model(s) needed for a particular business case. For our demo, the scripts were prewritten and applied. Rerunning the visualization with the R script produced the forward-looking forecast, showing the predicted pattern several months out from the latest date.
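The demo's R scripts themselves aren't reproduced here, but the shape of the computation (fit a model to historical values, then extrapolate forward) can be sketched in plain Python. The monthly on-time percentages below are made up for illustration:

```python
# Illustrative sketch (not the demo's actual R script): fit a simple
# linear trend to monthly on-time arrival rates and project it forward,
# the same shape of computation the Tableau + R integration performs.
def linear_forecast(history, periods_ahead):
    """Least-squares line through (index, value) pairs, extrapolated."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return [intercept + slope * (n + k) for k in range(periods_ahead)]

# Hypothetical monthly on-time percentages for one carrier.
history = [78.0, 79.5, 81.0, 82.5, 84.0, 85.5]
forecast = linear_forecast(history, 3)  # project three months out
print(forecast)
```

A production forecast would use a proper time-series model (seasonality, confidence bands), which is exactly what the R integration supplies; the point here is only the fit-then-extrapolate pattern.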

Our event wrapped up with a discussion and demo of new Tableau 8.2 features, including the recently announced support for Mac and the new Story Board. For the latter, an example was created using the demo visualizations that were presented, and annotated to explain how a conclusion was reached based on the steps taken.

Audience feedback on our event was very favorable, and FYI Solutions will be scheduling a similar online event for those who could not attend in person. If you are interested in attending the forthcoming online event covering Tableau with open-source R for predictive analytics, please send us an email with your contact information at events@fyisolutions.com.

FYI Solutions, a Tableau Reseller Partner, has a strong foundation in data warehousing. We help our clients solve their data and performance issues by using proven data management and optimization practices. The result yields trusted reports and visualizations delivered on a timely basis and meeting expectations. FYI Solutions’ partnership with Tableau Software amplifies our mutual commitment to provide unparalleled value to the clients that we serve.

 

Share
Aug 20, 2014

Tailor Your Resume

Author: Sean Smallman

Applying for jobs can be a frustrating endeavor. More often than not, you receive no response to your application, and your resume is lost in the proverbial black hole. Although there is no way to guarantee that your resume avoids this fate, one approach is certain to increase the likelihood of your resume generating a response and ultimately securing an interview. This approach is called "tailoring your resume."

Tailoring your resume is the process of identifying the required skills and technologies in a job description and fine-tuning the description of your professional experience to accurately display your experience with those requirements. When applying to a job posting, your resume often ends up in the hands of a recruiter or human resources representative before being passed along to the hiring manager. Chances are these “gatekeepers” are non-technical and extremely busy. Tailoring your resume to the job description makes it easier for a gatekeeper to quickly identify your relevant skills.

Below are three simple steps to tailoring your resume:

Step 1 – Identify Keywords – The first step in the process of tailoring your resume is to read the job description to identify and highlight keywords and technologies.

Step 2 – Create a Qualifications Summary – A Qualifications Summary is a section of your resume that gives a comprehensive yet concise snapshot of the skills and technologies that you've utilized throughout your career. The technologies highlighted in the previous step should be included at the top of this section to catch the eye of the recruiter.

Step 3 – Customize your Professional Experience – the Professional Experience section of your resume is your chance to elaborate on the skills and technologies listed in the Qualifications Summary, especially those relevant to the job description. Describe the context in which you’ve used these skills and technologies, any measurable success you’ve had, and the value that success has brought to each company or project.

Tailor your existing resume using these three steps when applying to each individual job posting. The key to a well-tailored resume is keeping keywords and technologies at the top of each section of your resume. These skills need to be relevant to the position for which you are applying, they need to be skills you are currently using, and they need to be consistent throughout your resume. If you work toward creating a resume that is relevant, current, and consistent for each position you apply for, you will undoubtedly increase your chances of securing an interview.

FYI Solutions is a leader in strategic staffing. For more information about opportunities through FYI Solutions, please contact us.

Share
Aug 08, 2014

Data Virtualization Overview

Written by: Kevin Jacquier

Overview

This document provides an entry-level introduction to Data Virtualization. The majority of the information in this paper is based on information gathered during a one-day class presented by Dave Wells at a TDWI Conference, entitled "TDWI Data Virtualization: Solving Complex Data Integration Challenges."

What is Data Virtualization?

There are many definitions of "Virtualization" and "Data Virtualization" that one can obtain from dictionaries, Wikipedia, and industry experts. To keep it very simple, "Data Virtualization" is providing access to data directly from one or more disparate data sources, without physically moving the data, in such a manner that the technical aspects of location, structure, and access language are transparent to the Data Consumer.

What this really means is that Data Virtualization makes data available for Business Analytics, without the need to move all the information to a single physical database.   It is important to keep in mind that there are times when this is not possible directly from the source.  There are also times when it is much more efficient to use Data Virtualization directly from the source rather than replicating the information.

There is one key aspect of Data Virtualization to keep in mind: Data Virtualization does not replace ETL; it complements it. This means Data Virtualization doesn't work well when significant transformations or complex business logic are needed on the source data before it can be used by the Data Consumer. Data Virtualization works well when a Data Warehouse is its source or when the source data can be accessed with minimal complexity. So why use Data Virtualization?

Data Virtualization removes the need to move data from Data Warehouse to Data Warehouse, or even from Data Warehouse to Data Marts, which many companies do because it is the only way to make the data available to their applications. Data Virtualization works very well when the source data is well defined and readily accessible for business logic.

Data Virtualization is primarily based on a Semantic Layer that creates views over the Source Data. These views are set in layers: 1) the Physical Layer, or Connection Views (access to the source data); 2) the Business Layer, or Integration Views (linking data from the different sources); and 3) the Application Layer, or Consumer Views (presenting data in a manner that is understandable by the Data Consumer). There is no actual ETL with Data Virtualization; it is all just a series of views. This is both an advantage and a disadvantage of Data Virtualization. The Data Virtualization Semantic Layer sits between the Source Data and the delivery applications (i.e., Business Analytics).
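As a conceptual sketch of those three view layers (using an in-memory SQLite database to stand in for two disparate sources; all table and view names here are invented for illustration, and a real Data Virtualization platform would federate live systems instead), the idea looks like this:

```python
import sqlite3

# Conceptual sketch of the three view layers described above.
db = sqlite3.connect(":memory:")
db.executescript("""
    -- "Source" tables, standing in for two separate systems.
    CREATE TABLE crm_customers (cust_id INTEGER, name TEXT);
    CREATE TABLE erp_orders (customer INTEGER, amount REAL);
    INSERT INTO crm_customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO erp_orders VALUES (1, 250.0), (1, 100.0), (2, 75.0);

    -- 1) Physical Layer / Connection Views: one view per source.
    CREATE VIEW v_customers AS SELECT cust_id, name FROM crm_customers;
    CREATE VIEW v_orders AS SELECT customer AS cust_id, amount FROM erp_orders;

    -- 2) Business Layer / Integration Views: link the sources.
    CREATE VIEW v_customer_orders AS
        SELECT c.cust_id, c.name, o.amount
        FROM v_customers c JOIN v_orders o USING (cust_id);

    -- 3) Application Layer / Consumer Views: consumer-friendly shape.
    CREATE VIEW v_sales_by_customer AS
        SELECT name, SUM(amount) AS total_sales
        FROM v_customer_orders GROUP BY name;
""")
print(db.execute("SELECT * FROM v_sales_by_customer ORDER BY name").fetchall())
```

Note that no data is copied at any layer; the consumer view is resolved against the sources at query time, which is exactly the "views, not ETL" point made above.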

The diagram below represents these layers, as well as the Source and Data Consumer layers. This diagram was taken from a document available through Composite Software, one of the leading Data Virtualization software companies.

[Image: Data Virtualization layers]

Please note within the diagram above that the data sources can be data in any format, including "Big Data," which could include NoSQL, Hadoop, Web Services, and internal and external cloud data. In addition, there can be a multitude of different Data Consumers, including all the major Business Analytics vendors.

The Data Virtualization software has an Optimizer that optimizes the Semantic Layer query SQL before sending it to the actual data sources. In addition, the software has very hefty in-memory caching; this is the primary difference between what Data Virtualization software offers and some BI vendors' "caching" capabilities. Because Data Virtualization can keep a lot of data in memory, it can provide efficient response times. Data Virtualization works very well when the sources have high-volume data but the Data Consumers ask for low-volume summaries. In other words, Data Virtualization should not be used as a data extraction or data dump source, but as a Business Analytics source.

Although we may have strayed a little above, one of the really key aspects of Data Virtualization is making data available and allowing Data Consumers from different areas to access the data virtually, not physically. Application-specific Consumer Views can be created within the virtual space, eliminating the need to copy (materialize) the data into another application or database.

In addition to the features mentioned above, the following are also found in the main Data Virtualization vendors' products:

  • Data Governance: user security, data lineage, and tracking and logging of data access and use.
  • Data Quality: data cleansing, quality metadata, monitoring, and defect prevention.
  • Security: user security can be applied once in the Data Virtualization software; any application accessing data through it then has that security automatically applied.
  • Management Functions: management of the Data Virtualization environment, such as server and storage monitoring, network load monitoring, cache management, access monitoring, performance monitoring, and security management (at the domain, group, and user levels).
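The "define security once" point above can be sketched in a few lines of Python. The user names, regions, and policy rule here are all invented for illustration; the takeaway is that the filter lives in the virtualization layer, so every consumer gets the same policy automatically:

```python
# Sketch of row-level security applied once in the virtualization layer.
# Users, regions, and the policy itself are invented for illustration.

USER_REGIONS = {"alice": {"EAST"}, "bob": {"EAST", "WEST"}}

ROWS = [("EAST", 1200), ("WEST", 950), ("SOUTH", 400)]

def virtual_query(user):
    allowed = USER_REGIONS.get(user, set())
    # The filter runs here, in the layer, so every consumer
    # (BI tool, spreadsheet, API) inherits the same policy.
    return [row for row in ROWS if row[0] in allowed]

print(virtual_query("alice"))   # [('EAST', 1200)]
print(virtual_query("carol"))   # [] -- unknown users see nothing
```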

The following diagram from Composite Software shows their "Platform," or functional areas.

[Diagram: Composite Software platform functional areas]

Benefits of Data Virtualization

Data Virtualization allows for a very agile development environment.  The primary reason is that Data Virtualization is based on creating views into the data, rather than coding the database objects (tables, views, procedures) and ETL code needed to support data materialization (a data warehouse fed by ETL).  This allows for much quicker, more cost-effective development cycles than data materialization projects.   Please note, however, that Data Virtualization isn't always a replacement for data materialization; there are times when ETL is necessary and Data Virtualization isn't the only solution.  Even then, Data Virtualization can still complement the materialized solution.

Data Virtualization provides the following Business Benefits over Data Materialization:

  • Supports Fast Prototype Development
  • Can be an interim solution to final ETL Project
  • Quicker time-to-solutions for business
  • Respond to increasing volumes and types of data
  • Increased data analysis opportunities
  • Information completeness
  • Improved information quality
  • Reduced data governance complexity
  • Better able to balance time, resources, and results
  • Reduced Infrastructure Costs

The following Technical Benefits are obtained through Data Virtualization over Data Materialization:

  • Ease of Data Integration
  • Iterative Development
  • Shorter and Faster Development Cycles
  • Increased Developer Productivity
  • Enables Agile data integration projects
  • Works with unstructured and semi-structured data
  • Easy Access to cloud hosted data
  • Query performance Optimization
  • Less maintenance & management of integration systems
  • Complements Existing ETL data integration base
  • Extension and migration – not radical change

When to Use Data Virtualization

Data Virtualization can be used whenever companies have data in disparate data sources or need to merge data with other resources such as social media or external data.  The question really being asked here is whether Data Virtualization can be applied directly to the source data, or whether the source data first needs to be materialized before it can be utilized by anything, including Data Virtualization.

The following are examples of factors that make Data Virtualization a good candidate.  If too many of these factors sway in the other direction, Data Materialization may be required.

  • Time Urgency for Solution Implementation
  • Cost / Budget limitations
  • Unclear or Volatile Requirements
  • Replication Restraints
  • Risk Aversion of Organization
  • Network Uptime
  • Source System Availability
  • Quality Data Available
  • Source System Load
  • Minor Business & Data Rules
  • Availability of History in Source System
  • Data Freshness requirements
  • Small Data Query Result Sets

When not to Use Data Virtualization

The question here is really whether data materialization is required before any access to the data can be performed, regardless of whether Data Virtualization is used.  Data Virtualization can always complement materialized data; the issue is whether Data Virtualization can go directly to the source, or whether the source needs to be materialized first.

The following factors would tend to influence a project to materialize data for any type of access.

  • Low Availability of the Source Data (not available when needed for reporting) *
  • Heavy load already on source Data *
  • Poor Data Quality requiring Significant Data Cleansing
  • Complex Data Transformation Requirements
  • High Volume Result Sets for Data Consumers
  • Data Source is Multidimensional (Cubes)
  • History Not available in Source Systems

* For some of the factors that influence the need to materialize data first, it should be asked whether data replication could resolve the issue rather than full materialization (ETL to a warehouse).

Data Virtualization Software

There are many Data Virtualization software vendors in the market today.  The following are a few that are leading the market and are identified by analysts as leaders in Data Virtualization.  At this time, I have not researched cost or server requirements for any of these vendors.

  • Composite Software
  • Denodo Technologies
  • IBM (InfoSphere Federation Server)
  • Informatica
  • Rocket

Many of these companies are willing to provide a proof of concept before you commit to a purchase agreement.

Summary

Data Virtualization has been around for many years but is really becoming mainstream today, and it will become much more of the norm in the very near future.  It is no longer necessary to continuously copy data from location to location as different groups need it.  Data Virtualization allows an organization to make data virtually accessible to all the different groups within the organization without physically storing it over and over again.   It also allows projects to be developed much more quickly and to change much faster, as there is no ETL layer requiring database changes as well as code modifications.

For more information about implementing Data Virtualization in your organization, contact FYI Solutions.

 

Share
Aug 01, 2014

Tips for Growing out of BI Immaturity

Author: Jeff Busch

As I visit various clients and potential clients, and as I talk to colleagues and network at events and conferences, I am amazed at the evidence I see that BI immaturity remains an issue. One would think that, by now, Business Intelligence has passed from buzzword into regular business practice and that everyone knows how to do it. This is clearly not the case. Sometimes it feels a little like we've given the keys to the family car to a 10-year-old. However, there isn't a lot of guidance available in public sources to help those who are just starting out in BI to do it right from the beginning. If you google "BI Immaturity," there are two results that are actually related to Business Intelligence. If you spell out "Business Intelligence," the results are only slightly better. So what's a small business or corporate group to do?

There's nothing wrong with immaturity; it implies that there is something to grow into. Maturity is possible, and it doesn't require a history of failed projects to attain. Although most fledgling BI initiatives will fail to reach their full potential, or fail altogether, this is often not directly the fault of the software or solutions vendor. It is often due to the client asking too much of them without an understanding of what BI maturity involves. In this blog, I hope to speak to the small business or corporate group and provide a few tips to help these groups start well so that they can finish well.

First of all, let’s define a few things. Since I am assuming that my audience might be just starting out in Business Intelligence it would be a good idea to explain what I mean. There are many definitions of BI, but I like to describe it as using data to produce information that provides actionable answers to business questions. This is different from data reporting. Reporting will give you a list of customers and what they bought last week. BI will show you which customers are the most profitable and what customer types are trending up or down so that you can appropriately adjust your marketing and sales practices.

When I talk about BI maturity, I am talking about the capacity to understand this connection between business questions, data, and information. Mature BI groups can look at an unanswered business question, determine what specific report requirements will help answer that question, and what data is required to fulfill those requirements. They understand that the quality of the underlying data is the key to good results and they are willing to put time and money into achieving a high level of data quality. Although many large organizations invest a lot in these initiatives, this seems to follow the typical 80/20 rule. 80% of the value comes from the first 20% of the effort; as a small business or independent corporate group, you can go far with a relatively small amount of focused attention on a few key things.

There are three things that all successful BI initiatives have in common. This list may not be comprehensive, but I think that these are the most important to focus on as you launch a new BI project. First, you need to ask the right questions. Chances are there is some business pain, or some business questions, that caused you to determine you need a BI solution. Many companies will start there, but when they actually launch the project, they lose focus and begin asking questions like "What data do we want in the reports?" or "What is the executive dashboard going to look like?" These are important questions to ask later when actually designing the reports, but they should not be the starting point. The most successful projects have a small number of business questions clearly defined, and the entire project focuses on answering those questions. Good business questions are:

  • How efficient is our product supply chain? Where are the greatest cost inefficiencies?
  • Which are our most/least profitable customers and why?
  • Why do we lose opportunities to our competitors? Where can we improve our sales cycle?

These high-level business questions can then be broken down into more detailed business requirements such as: sales, cost and gross margin by customer, grouped by region and customer type, displayed year over year. These business requirements are critical to define early because they will determine what data is required, how it should be stored, and what tool will best meet your analysis needs.
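A business requirement like the one above can be reduced to a simple rollup. Here is a minimal Python sketch of "sales, cost, and gross margin, grouped by region and customer type, year over year"; the order rows and field layout are invented for illustration:

```python
from collections import defaultdict

# Hypothetical order rows: (year, region, customer_type, sales, cost).
orders = [
    (2013, "EAST", "Retail",    1000.0, 600.0),
    (2014, "EAST", "Retail",    1200.0, 700.0),
    (2013, "WEST", "Wholesale",  800.0, 650.0),
    (2014, "WEST", "Wholesale",  900.0, 640.0),
]

# Roll up sales and cost by (region, customer type, year).
totals = defaultdict(lambda: [0.0, 0.0])
for year, region, ctype, sales, cost in orders:
    totals[(region, ctype, year)][0] += sales
    totals[(region, ctype, year)][1] += cost

# Gross margin per group, displayed year over year.
for key in sorted(totals):
    sales, cost = totals[key]
    margin = (sales - cost) / sales
    print(key, f"margin={margin:.1%}")
```

In practice this rollup would be a query against the BI data store, but the point stands: once the requirement is stated precisely, the data fields it needs (year, region, customer type, sales, cost) fall out directly.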

After you have clearly defined your business questions and business requirements, you need to focus on your data. Many projects fail because little attention is paid to the data early on; it is assumed that because the data looks OK in the source system it will work OK in a BI reporting system. Then, after all the work of designing reports is done, it is discovered that there are data quality issues such as missing data in required fields, inconsistent codes (like 53 different state abbreviations), names stored sometimes first-last and sometimes last-first, fields used for multiple types of information, etc. All of these things can make well-built reports nearly useless to the business.
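The kinds of cleansing rules this implies are usually mundane. Here is a small Python sketch for two of the problems mentioned above; the specific mappings and the name-order heuristic are invented for illustration:

```python
# Illustrative cleansing rules for two common data quality issues:
# inconsistent state codes and inconsistent name order.

STATE_FIXES = {"N.J.": "NJ", "New Jersey": "NJ", "nj": "NJ"}

def clean_state(value):
    value = value.strip()
    return STATE_FIXES.get(value, value.upper())

def clean_name(value):
    # Normalize "Last, First" to "First Last".
    if "," in value:
        last, first = [p.strip() for p in value.split(",", 1)]
        return f"{first} {last}"
    return value.strip()

print(clean_state("New Jersey"))   # NJ
print(clean_name("Smith, Jane"))   # Jane Smith
```

Rules like these belong in the load process that feeds the BI data store, so every report sees the same cleaned values.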

The data required by the reports needs to be clean, complete, and stored in a way that is efficient for reporting. This is not an afterthought; it should be a very high-priority stage in the project and should be done before moving on to designing and building the actual reports. This also highlights the need for a dedicated place to store the data used by the BI solution. This is called a data warehouse or data mart, and it is where you will clean up and process the data to get quality information from it. Although large organizations have large, complex data warehouses, this isn't required. Yours can be relatively small and inexpensive, and you might even be able to use open source database technologies like MySQL. A good solutions vendor will guide you down this path and place a high priority on getting things right at this point before moving on.

The third major element of successful BI practices is designing and building a solution, not installing a tool. Very few tools can handle all of the necessary elements of BI by themselves, and there is no tool that is equally good at all types of analysis. It is very important to design a complete solution that delivers the best answers to your business questions, choosing the right tool for each piece of the solution. In some cases all of the tools should come from the same vendor for cross-compatibility and intercommunication; in other cases, individual functions should be handled by tools with a narrow focus, and tools from multiple vendors might be chosen. The key is to choose the right tools for the right job, not to choose the tool and then build the solution around it. Another danger related to this is choosing a tool or building reports based solely on visual glitz and glamour. Good-looking visuals are important, but the most important thing is to deliver actionable information in as clear a way as possible. Designing attractive visuals must support this goal, not be the goal in and of itself.

As with all things in life, BI maturity comes with time and guided experience. By focusing on a few key things, a small business or corporate group can quickly grow in its understanding of BI and how to use it effectively to improve the business.

For more information about getting started with business analytics, or to measure your current business intelligence maturity, contact FYI Solutions.

Share