Is Social Media Right For Your Small Business?

June 30, 2015


Everyone from pre-teens to granddads does social media today. With Twitter, Instagram, Facebook and newer ways to stay in the know popping up every day, picking the right platform can be a difficult task. While the personal value of this modern convenience seems obvious to most, proving the channel’s worth to a business can be very challenging.

Even so, many larger companies have taken the plunge. According to a recent eMarketer survey, 88 percent of U.S. companies with 100 or more employees use social media for marketing purposes, a figure expected to rise slightly to 89.5 percent in 2016. Dell has been a leader in the use of social media for business and recognizes that its value goes beyond improved brand awareness. This large global corporation has become especially adept with social media, learning how to meaningfully increase sales and revenue.

A similar survey of 350 small businesses done by the research firm Clutch, however, found that nearly half of those organizations don’t actively use social media to promote their businesses and 25 percent say they have no plans to do so in the future. Do these numbers represent a business “social media gap”? Are small businesses missing the boat?

Strategize, then analyze the numbers

As a small business owner myself, I find this question more than just academic. When I founded GovCloud Network over a year and a half ago, using social media for marketing and opportunity identification was one of my strategic planks. In my simplistic view, we would use Twitter to advertise new content posted on the company’s blog, Cloud Musings. The expertise and knowledge demonstrated by thought-leadership pieces would, in turn, drive our targeted customer segment straight to the company website.

After reading this study, though, I began to second-guess both the money and time investments. So prior to finalizing next quarter’s budget, I decided that an objective and quantifiable evaluation of our social media program’s ROI was needed.

The good news was that because social media was in the plan from the very beginning, we already had site visit data from the initial launch of all our sites. The bad news was that, being a startup, we couldn’t afford any fancy customized social media tracking service. We could only use what was freely provided by Google Analytics and Twitter. Luckily those tools are very good, and we were able to pull robust data sets from two three-month periods, August-October 2014 and March-May 2015. That data enabled us to measure Twitter engagement rate and the number of company blog and website users.
Social Media Engagement Model


The engagement rate measures all clicks on a tweet, including retweets, replies, favorites, follows and link click-throughs. It is also used as a measure of how well a tweet resonates with its audience. By defining visitor segment characteristics in Google Analytics, we were also able to quantify how many users were members of specific market segments. The segments we selected corresponded to specific GovCloud Network business lines.

With this data and relative percent difference as a comparative measure, the results were stunning. Except for a reduction in the overall number of pages per session on the company website, our documented increase in Twitter engagements drove increases in the number of blog and company website users. This trend was also seen across all of our customer segments.
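
For readers who want to run the same kind of check, here is a minimal sketch of the two measures involved, engagement rate and relative percent difference. The figures are invented for illustration; real numbers would come from the Twitter analytics and Google Analytics exports described above.

```python
# Illustrative only: hypothetical figures standing in for Twitter and
# Google Analytics exports from the two three-month measurement periods.

def engagement_rate(engagements, impressions):
    """All engagements (clicks, retweets, replies, favorites, follows)
    divided by impressions."""
    return engagements / impressions

def relative_percent_difference(a, b):
    """Relative percent difference between two measurements."""
    return abs(a - b) / ((abs(a) + abs(b)) / 2) * 100

# Hypothetical period data (Aug-Oct 2014 vs. Mar-May 2015).
period_1 = {"engagements": 1200, "impressions": 80000, "site_users": 3500}
period_2 = {"engagements": 2100, "impressions": 95000, "site_users": 5200}

er_1 = engagement_rate(period_1["engagements"], period_1["impressions"])
er_2 = engagement_rate(period_2["engagements"], period_2["impressions"])

print(f"Engagement rate: {er_1:.2%} -> {er_2:.2%} "
      f"(RPD {relative_percent_difference(er_1, er_2):.1f}%)")
print(f"Site users: {period_1['site_users']} -> {period_2['site_users']} "
      f"(RPD {relative_percent_difference(period_1['site_users'], period_2['site_users']):.1f}%)")
```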

Give social media a try

Does this mean that social media is right for your small business? No, it doesn’t, but it does show how a small business can monitor, quantify and analyze the ROI of a social media strategy. My best advice is to not ignore the value that social media can bring. Using these services strategically does take an investment in time and sometimes even a little money, but the value could be significant. Have an open mind, plan a pilot, give it a little time, and don’t forget to do the numbers.

Social Media Engagement Results


(This post was written as part of the Dell Insight Partners program, which provides news and analysis about the evolving world of tech. To learn more about tech news and analysis visit PowerMore. Dell sponsored this article, but the opinions are my own and don’t necessarily represent Dell’s positions or strategies)






Rob Davies, ViON SVP, Talks about Government Cloud Computing

June 22, 2015

ViON solves complex enterprise problems by combining passion and agility to deliver effective, innovative solutions; commitment to mission success is in their DNA. One of the ways they deliver success is ViON on Demand™, which provides highly secure compute, network and storage capabilities through on-premise private clouds. ViON on Demand supports customers whose business strategy is to consume IT infrastructure as a managed service. Through ViON on Demand, customers can procure and consume a range of IT hardware and software suited to their specific needs (compute, storage, data center networking). This strategy helps them:
  • Use technology on-premise, like a private cloud;
  • Customize technology, vendor and configuration based on specific needs;
  • Scale up and down to meet demand without penalty or minimums;
  • Pay with operations dollars rather than capital expenditure;
  • Achieve best-practice, customized service-level agreements (SLAs); and
  • Enjoy 24/7 live, secure support when needed.  
The executive responsible for managing this business is Rob Davies, Vice President, ViON on Demand. I had the opportunity to meet him at the ViON headquarters building in Herndon, Virginia, for a discussion on government cloud computing.

Kevin: Thank you very much for the opportunity to speak with you about cloud in the US government.  To start off, what is your position here at ViON?

Rob: Thank you Kevin for coming out to visit us.  I am the Executive Vice President of Operations here at ViON and also have the responsibility of managing our On Demand cloud solutions.

Kevin: Being responsible for ViON’s cloud computing solutions seems like a pretty demanding task. How is that going?

Rob: Cloud computing in the US Government marketplace holds great promise, but yes, it also presents a demanding challenge. As you know, the US Federal marketplace has been a budget-constrained environment for quite a few years, but that environment is actually good for cloud computing because it has forced agencies to look for better ways to do information technology. Here at ViON, we’ve actually benefitted from that.

Kevin: That sounds pretty interesting.  Can you please elaborate on that a bit?

Rob: Sure. Agencies looking for better and more efficient ways to do information technology have really needed to figure out how to use cloud within their existing organizational structure. This is more difficult than it appears on the surface because government IT organizations are typically structured around a horizontal view of the IT infrastructure. That means that all their processes and decisions are aligned with IT operational layers: the server team makes decisions on servers, the storage team makes decisions on storage, the application team makes decisions on applications, and so forth. This organization also drives budget allocations and decisions along those same operational layers. This horizontal viewpoint doesn’t work well with cloud computing because budget decisions need to be more aligned with mission, workload and application characteristics. To do this properly, the organization needs to adopt a more vertical view of the IT infrastructure.

Kevin: How have ViON’s cloud computing customers dealt with this problem?

Rob: Through our professional services support, ViON has been able to help its customers elevate their organizational viewpoint. This has enabled them to figure out how to use cloud effectively without changing their existing organization. In a way, we have collaborated with our customers and now know how to do cloud within this traditional, componentized organizational structure.

Kevin: How is that done? Many have said that cloud computing is nearly impossible without changing existing policies or getting FAR (Federal Acquisition Regulation) waivers.

Rob: The first step in the transition is to get legacy infrastructure people more familiar with cloud consumption models. You also need to move them away from a focus on the technical specification of the infrastructure. In my experience, the expertise of government IT professionals is very high.  The only issue is that organizationally, they are forced to see cloud as an extension of the infrastructure component that lies within their responsibility. Storage people can deal with storage-as-a-service, but they have no authority to link a server or application with that storage. Once the infrastructure team collaborates with a vertical viewpoint, it can then build a common lexicon for the solution being designed. This, in turn, will drive organizational changes that are friendlier to more efficient consumption-based IT service models.

Kevin: What about the budgeting models? Aren’t they still based on IT components?

Rob: Yes, and most federal agencies are way behind in that area. It is, however, a bit easier in the DoD because of the use of working capital funds. This budgeting construct was designed as a means of dealing with the wide variability of the DoD mission, and that same flexibility can be applied to cloud services. There is no widespread counterpart on the civilian side. Civilian agencies have a willingness to adopt cloud, but the acquisition challenges and the lack of a working capital construct make it more difficult.

Kevin: So how can ViON help agencies get over this hurdle?

Rob: ViON has experience in helping agencies learn how to manage a traditional fixed budget in an environment that has variable purchase requirements. Options include ordering agreements and blanket purchase agreements, which have more funding flexibility than direct-award contracts. We can also determine appropriate workloads for cloud migration, help in analyzing the budget process around those specific workloads and assist with documenting and forecasting capacity needs. Although peak capacity requirements will certainly be in the budget, that money may come back if the capacity is not actually needed.

http://www.vion.com/Agile-Cloud-Solution/Agile-Cloud-Platform.aspx
Kevin: Are you arguing for changes in government procurement rules?

Rob: Not really. Procurement rules don’t need to be changed but more flexibility needs to be allowed.  COTRs and Contracting Officers just need better tools for purchasing cloud. For example, an ability to pool funds across infrastructure or multiple mission areas would go a long way.

Kevin: You’re really arguing, then, for a more holistic view and increased visibility of IT within the government. Neither of those is part of government culture. How do you see this happening?

Rob: Change is hard, and cloud computing defines a hard change. To be successful in this, government agencies need to tap the knowledge of government IT infrastructure professionals and make them an integral part of the process. Those professionals know their agency’s mission and how best to manage this change. Unfortunately, in the past, they have been the last to know when an application or system was being funded and built. The government can absolutely do it, but very strict restrictions on how money can be spent may need to be changed. Property and use tax payments are a case in point.

Current tax payment rules are driven by ownership. When the government uses cloud services, the CSP (Cloud Service Provider) still owns the equipment, and the FAR is silent on this type of situation. Restrictions on the use of different colors of money may also need to be addressed. Today the CIO doesn’t have any budget authority. FITARA (the Federal Information Technology Acquisition Reform Act) was designed to help in this area, and we can only hope that Congress can see a way forward in helping the CIO move from managing through influence to managing with authority.

Some of the new vehicles are more structured for cloud with dedicated acquisition shops. This will help the rest of the acquisition community come along.

Kevin:  Any advice for CIOs trying to tackle the challenge of transitioning to the cloud?

Rob: We’ve coached our customers to look at the total acquisition process. When initiating a consumption-based IT contract, allow for time to transition from one contractor to another. Since the vendor needs to be able to make and recoup its investments, contracts tend to be longer, and the government needs to be able to scale up with a new vendor slowly. This approach maximizes the value to all parties.  A total acquisition process view also reduces contract churn, contract-related technical evaluations and overall acquisition cost.

Kevin:  In wrapping up, what is the health of cloud in the government? What is your prognosis for the future?

Rob: I am really optimistic. It will take a lot more time, but we will get there. The mainframe won’t go away, and neither will cloud. We will get there because there are more offerings in the market, more variety, more flexibility, better acquisition models and cross-pollination across the government.

Kevin: Thanks Rob.


Rob Davies explains ViON On Demand

( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)






New Approaches for New Big Data Insights

June 15, 2015

by Melvin Greer

Business Intelligence has matured as a core competency necessary to sustain competitive advantage. Organizations of every size and industry are generating valuable data with each interaction, and that data can be captured, analyzed, and turned into business insight. These organizations are using analytics features like dashboards, advanced visualization, data warehousing, and other technologies to achieve their strategic business objectives.

Many companies are taking a hybrid cloud approach to data analysis. Leveraging a hybrid cloud environment as part of a big data analytics strategy enables businesses to take advantage of cloud elasticity. This allows organizations to process data across clusters of computers, enabling analysis to occur across multiple cloud compute environments. As organizations’ need for more compute power grows, the cloud can scale with their requirements. Cloud-based business analytics capabilities enable organizations to make smarter decisions that better address real-time business imperatives.
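
As a rough illustration of what processing data across a cluster can look like in practice, here is a minimal PySpark sketch. The file name and column names are hypothetical; in a hybrid deployment the same job could read from on-premise storage or a cloud object store simply by changing the input path.

```python
# Minimal, illustrative sketch of distributed analysis with PySpark.
# "interactions.csv" and its columns ("segment", "amount") are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hybrid-cloud-analytics").getOrCreate()

# The same job can point at local storage or a cloud object store.
df = spark.read.csv("interactions.csv", header=True, inferSchema=True)

# A simple aggregation that Spark distributes across the cluster:
# total value and interaction counts per customer segment.
summary = (df.groupBy("segment")
             .agg(F.sum("amount").alias("total_value"),
                  F.count("*").alias("interactions")))

summary.show()
spark.stop()
```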

Analytics capabilities are moving beyond traditional business intelligence, and forward-leaning organizations are analyzing data in new ways to distance themselves from the competition. Analytics is enabling businesses to align the right customers with the right solutions, identify customer patterns of behavior, and quickly resolve customer service issues by correlating and analyzing a variety of data.

The Analytics Toolbox

Descriptive analytics, used to explain what has happened and what is happening now, offers organizations a context-relevant view of the business with discovery, visualization, and interaction capabilities. This enables businesses to examine critical performance metrics to understand business impact, evaluate business processes, and drill down into data to get a single view of the past and the present. Bringing together structured and unstructured data from across the enterprise, descriptive analysis provides a consistent view of key business metrics in real time. It shows the “hard to see” trends and patterns through data dashboards, reports, and visualizations that not only detail what is happening, but also begin to diagnose why. Descriptive analytics provides insight that details the reasons behind the results.

Diagnostic analytics, which explains why things happened, is critical to improving business operations and processes. Diagnostic analytics starts during the descriptive analytics phase and gets into root-cause analysis and data discovery. Exploration into disparate data coming from many different sources feeds interactive visualizations that can uncover patterns and correlations that drive business-specific predictive models. Businesses can use these correlations and relationships to predict and plan for the future by identifying key factors that directly and indirectly affect performance. The insights gained provide a deeper understanding of why the business is performing the way it is. Diagnostic analytics reveals success factors that drive future growth.

Predictive analytics, which uses data mining and descriptive and diagnostic analytics along with predictive modeling to describe the probability of future business events, finds patterns in historical and real-time data to anticipate what is ahead, and identifies business risks and opportunities. Using predictive models, statistical analysis, data mining, real-time scoring, and a range of advanced algorithms and techniques provides organizations the ability to analyze business trends and relationships in current and historical data to forecast future business results. Predictive analytics helps organizations make better-informed decisions based not only on what has happened, but also on what is most likely to happen in the future.

Prescriptive analytics uses predictive models and optimization techniques to recommend reasonable courses of action and shows the expected outcome of each. Organizations make more informed business decisions in real time by evaluating different ways to move forward, with an understanding of the consequences of each option. Using “what if” scenarios, predictive models, rules, and decision logic enhances decision making by evaluating a variety of viable options and their likely outcomes. Organizations learn to take advantage of an opportunity or mitigate a future risk by understanding the implications of their actions. Prescriptive analytics drives organizations’ confidence in the ability to make the right decisions to meet strategic business goals, improve customer engagement, lower risk associated with threats and fraud, and improve business processes.
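
As a simple, hypothetical illustration of the jump from descriptive to predictive analytics described above, the sketch below uses pandas and scikit-learn. The dataset and column names are invented and the model is deliberately minimal; the point is the progression from summarizing the past to scoring what is likely to happen next.

```python
# Illustrative only: a hypothetical "customers.csv" with columns
# "tenure", "monthly_spend" and "churned" (0/1).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("customers.csv")

# Descriptive analytics: what has happened and what is happening now?
print(df[["tenure", "monthly_spend"]].describe())
print(df.groupby("churned")["monthly_spend"].mean())

# Predictive analytics: what is most likely to happen next?
X = df[["tenure", "monthly_spend"]]
y = df["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("Hold-out accuracy:", model.score(X_test, y_test))

# Scoring current customers is the kind of output a prescriptive layer
# would feed into "what if" scenarios and decision logic.
df["churn_probability"] = model.predict_proba(X)[:, 1]
print(df.sort_values("churn_probability", ascending=False).head())
```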

Using Cognitive Analytics

Machine learning and natural language processing have moved out of research labs to become a business differentiator – fusing the benefits of internet speed, cloud scale, and adaptive business processes to drive insights that aid real-time business decision making. Cognitive analytics is the application of cognitive computing technologies to enhance human decision making. Organizations improve their ability to sense and respond by applying cognitive analytics to harness the power of big data. Cognitive analytics assists in the processing and understanding of big data in real time, in the face of ever-increasing data volumes and endless fluctuations in data form, structure, and quality. Cognitive analytics can extract content, embed it into semantic models, evaluate hypotheses, and interpret evidence, providing potential insights and then continuously improving them.

Business intelligence and analytics tools are the enabling technologies that provide information, knowledge, and insights to organizations, assisting them in making better decisions. They help organizations understand where money is being spent and the measurable value being realized from that spend. The use of analytics has surged as smart organizations harness the power of big data to improve decision making and efficiency, and the elasticity of a hybrid cloud computing environment can ensure that these organizations’ ability to manage, analyze, and interpret data grows along with the amount of data they collect. To succeed in today’s competitive market, business leaders must be able to turn data into business insights for sustained business growth, and the right computing environment is a key element of success.

Melvin Greer is Managing Director of the Greer Institute for Leadership and Innovation, focused on research and deployment of his 21st Century Leadership Model. With over 29 years of systems and software engineering experience, he is a recognized expert in Service Oriented Architecture, Cloud Computing and Predictive Analytics. He functions as a principal investigator in advanced research studies, including Nanotechnology, Synthetic Biology and Gamification. He significantly advances the body of knowledge in basic research and critical, highly advanced engineering and scientific disciplines. Mr. Greer is a Certified Enterprise Architect, the Steering Committee Chair of the Cloud Standards Customer Council and a member of the U.S. National Academy of Science.


( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)






How to Put Public Sector Data Migration Hassles on the Road to Extinction

June 12, 2015



With careful planning and the right technology, Federal, State and Local Government IT Leaders can overcome fears of data migrations, breaking free from archaic procedures to lead the pack 

By David Wegman, Vision Solutions, Senior Vice President, Integrated Accounts




Jurassic World, the latest installment in the Jurassic Park film series, opened this week – and there’s a lot of hype surrounding the premiere as fans immerse themselves in a world of Mesozoic Era-inspired fantasy. While the creatures that make the theme park their home are strikingly realistic, their real-life counterparts became extinct millions of years ago. Many believe that the once-mighty dinosaur population fell in large part because it failed to evolve with the changing world around it. Public sector institutions face a similar plight today, especially as technology advancements demand they constantly evolve in order to keep up.

Much like the dinosaurs fought for survival, governmental organizations must fight for resources. They must embrace change in order to thrive, and part of that involves modernizing systems, streamlining processes and migrating vast amounts of data. However, many organizations postpone such work due to uncertainties about the impact and technology risks associated with these procedures, including the inherent downtime associated with most migration methodologies.

Many public sector CIOs and IT leaders are concerned about the fallout from failed migrations, which are a painful waste of time and resources. And their concerns are not unfounded: in its 2015 State of Resilience report, Vision Solutions revealed that 36 percent of respondents had experienced a migration failure. While failures are a relatively common occurrence, they are not inevitable. A thorough planning process and the right resources go a long way to improve the chances of success.

Regardless of the reason for a migration, significant complexity and potential pitfalls litter the path from point A to point B. In addition to understanding the migration process and identifying who is going to do the work, users must assess downtime’s potentially negative impacts.

The fact is, migrations are complex; even those that sound simple are inherently complex. Because most servers and databases are not single instances within a data center but interconnected to other systems and databases, including mid-range and mainframe systems, there is immense variety that complicates migrations today. These various platforms need to be coordinated in migration waves to mitigate their impacts.  This all presents complexity – and complexity presents risks.

The first task toward successful migrations is to map out a thorough migration plan beforehand: IT needs to determine what they are going to migrate, what it is all connected to, who is going to do the work and when they can get a “migration window” from their unit, as well as how they will navigate around the many real-time issues that may arise along the way. Planning ahead for potential issues gives IT clarity on factors that may affect the migration process, allowing the migration team to address problems in advance and in real-time.

All migrations need testing before deployment; and testing further contributes to the time and resource-intensive nature of migrations. Traditional pre-migration testing can take anywhere from a couple of weeks to a couple of months, depending on how complex the applications, databases and server inter-connections are in the data center. The process typically involves halting production periodically to take snapshots of data and testing those snapshots.  Each time the process is completed, IT must restore the database and start over. This typically involves multiple test runs and multiple cycles within the organization. The entire process can take anywhere from hours to entire days depending largely on the factors outlined above.

IT often faces an uphill battle in convincing leadership to agree to a migration project due to downtime risks and impacts. This can cause substantial delays, compounding the migration’s complexity. Elected and appointed officials may not always have exact numbers on hand, but they do realize that downtime is costly. Nineteen percent of respondents in the Vision Solutions 2015 State of Resilience report indicated that the cost of downtime ranged from $10,000 to $50,000 per hour. Fearing the steep costs of downtime and associated risks, leadership may hesitate to green light migration projects, preventing successful execution.

How can public sector IT leaders address this problem? Government IT leaders need more than ever to seek out trusted partners with a track record of helping organizations like them accomplish important migrations and system upgrades. Rather than reacting to perceived risks, limited expertise or cost of downtime, they should be confident and proactive, following best practices to set them up for success. Organizations that embrace proven technology and methodologies will be well-positioned to realize the full benefits of smooth migrations.

Uncertainty, risk and extended downtime don’t need to be migration realities. By working with a trusted partner and utilizing modern technology and methodology, public sector IT leaders can achieve near-zero downtime during migrations, minimizing impact on the organization and users. But to do so, they should consider the following when selecting a migration solution:

1. Real-time replication is paramount: Organizations should look for solutions that offer the most flexibility and currency of data possible while minimizing impact to users during testing and migration. This typically requires a software-based solution that replicates any activity taking place on the production server to the target server in real-time, allowing IT to keep the production server up and running rather than freezing it or periodically pausing it for snapshots. The production server remains fully functional, data is as current as the last transaction and users continue working. IT can test applications on the new server, and prove the migration methodology and plan, without impacting the production environment. Ultimately, this makes IT more productive on other tasks with improved uptime – all while migration is taking place.

A second consideration is how to take the distance from production server to target server out of the equation. Because real-time replication sends changes as they occur, it minimizes communication-line usage, and distance becomes less of an issue. When coupled with compression and throttling in a product, this creates a high degree of efficiency.
Finally, because databases and servers are maintained in sync at all times, IT does not need to freeze production and wait for final validation of the testing server before performing the migration. Weekend migrations are no longer the norm, as the switch to new environments can occur whenever the organization is ready and take place in as little as 20 minutes – a notable improvement over switch times in traditional migrations.

2. Unified consoles simplify the process: Another feature government IT leaders should demand in their migration solution is a unified console that allows IT to work on all types of migrations with a common workflow across operating systems and platforms. This provides a major advantage as it mitigates the need for different skillsets typically required for different types of migrations by platform or workload.

While IT staff certainly need to understand the underlying architectures and databases, a uniform console and workflow reduces training time and maximizes the existing team’s skillset. A single operator can perform parallel migrations across multiple platforms after product training sessions, minimizing the drain on resources.

3. Consolidating migration streams delivers faster execution: Simultaneous executions also ease the impacts to the organization. A solution that allows users to run parallel streams of migrations saves organizations significantly more time than traditional methods. This method facilitates near-zero downtime, shortens time to completion, mitigates costs and frees up IT resources to focus on other projects.

4. Automation minimizes risk: Traditional methods typically require a fair degree of manual work, which means a higher degree of risk. While no migration can happen without people, solutions that use automation to reduce the amount of human interaction required diminish that risk. This is important to keep in mind: failing to choose a solution that provides APIs and the ability to automate as much of the work as possible introduces additional human interaction, and therefore risk. That lack of automation often results in failed migrations, migrations that run over budget, or migrations that last longer than expected.

5. Hardware- and software-independent solutions enable flexibility: Every server is different and topologies change rapidly. Migrating across server types, chipsets, storage devices, databases, versions and the like all need to be addressed in a migration plan. A hardware- and software-independent solution reduces the risk potential in these areas. This model allows users to migrate data seamlessly from any one type of environment to another. The options are virtually endless – from physical to virtual to cloud across any operating system, chipset or storage device.
Using platform-independent technology makes many scenarios possible including migrating between storage from different vendors, migrating to a server located anywhere in the world, consolidating servers with many-to-one migration and moving operations to a new data center across extended distances with very little downtime.

Evolve Continuously to Achieve Migration Success
While data migrations will always entail a certain amount of risk and downtime, modern solutions have greatly improved the chances of a positive outcome. IT leaders at Federal, State and Local governments and their agencies who act confidently, instead of out of fear, to take on important migrations will come out on top of the food chain, evolving to thrive for the benefit of both their organization and the taxpayer.
 


( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)


