Media enthusiasts and gamers perpetually seek super-charged PCs that combine high-speed processing with impressive graphics. Dell's XPS 8300 delivers all of this, making it the ideal partner for entertainment gurus and gamers who immerse themselves in their favourite movies, games, music and much more. With the availability of Intel's Core i7 processor option, Dell's XPS 8300 is designed to give you the best performance in its class. Digital media creation (video editing, encoding, image rendering, audio production and real-time preview) is faster than ever before.
The new Dell XPS 8300 offers uncompromising quality and powerful performance:
• Up to 16GB DDR3 memory
• Up to 4TB of storage across dual drives
• Optional USB 3.0 support
• 460W power supply
Says Nitesh Devanand, Dell Consumer Product Specialist at DCC, "This desktop is perfect for any multimedia task; there is nothing you can't do on this desktop, and its capabilities are endless. This is the new generation of the 8300. It features an improved processor, a bigger hard drive and, of course, high-end graphics, making it ideal for users who need extreme power and easy accessibility."
With integrated 7.1 THX TruStudio, sound is truly amazing, providing a heightened experience with every explosion and every whispered noise. This PC is designed to deliver the stellar audio that is otherwise only experienced in live theatrical and musical productions.
Not only does the Dell XPS 8300 provide outstanding, proven power, but it is also distinctly stylish. Its tilt-back design and top-recessed tray are appealing to the eye and provide easy access to a multitude of ports, making it perfect for charging a variety of portable electronic gadgets. Featuring a modern yet sophisticated look with a black bezel and white sides, the XPS 8300 is a PC that you will not want to hide under your desk, but rather show off on top of it.
Dell Stage software delivers easy navigation of the Operating System (OS) and puts all your favourite movies, games, photos and videos only a mouse click away. Create photo slideshows, read online e-books and listen to a variety of music, all from the convenience of the Stage. Download your favourite applications and share your photos on Facebook and Flickr using the options within PhotoStage.
Dell's XPS 8300 Multimedia Desktop is available at all major retailers at the recommended retail price of R11,999.00 inclusive of VAT.
There is a shift driving companies and employees to work remotely, enhancing productivity by removing the restrictions that working from a fixed office imposes. It frees employees from traffic and, most importantly, allows work to be done 'on the fly'. This is no different when it comes to IT support. Businesses are not only realising the benefit of outsourcing their IT support, but are also able to leverage the additional benefits of remote support and monitoring, which is gaining traction in the South African market.
There are three types of remote support: in-house support sees the company's head office provide remote support and monitoring to its branches; the fully outsourced model lets a third-party handle support and monitoring from its premises; and the co-sourced model puts in place a centre of excellence that is co-built by the client and the outsourced service provider. All models have their benefits but the latter two deliver significantly more.
Time and travel = money & emissions
One of the most obvious benefits of remote IT support is that it reduces travel costs and travel time. It takes time and costs money for the support resource to get to the client when a call-out is logged. With remote support in place, however, the exercise is simplified. The system is monitored and analysed remotely and the problem is fixed immediately. Thus, traffic and the distance between the outsource provider and the client are no longer issues. In addition, and this is not an afterthought for most companies any more, remote support and monitoring allows companies to decrease their carbon footprint and lessens their impact on the environment. It also provides a far speedier response and resolution for companies that have branches in outlying areas or even different countries.
Compliance with standards
Standards and best practices have become the norm in today's digital world.
However, enforcing them is no easy task. It's particularly tricky when clients have remote branches and offices. Individual IT resources at these remote branches and offices further exacerbate this challenge as more often than not they create their own 'rules', thwarting the efforts of the business to retain consistency and remain compliant. Having remote support and monitoring allows businesses to create a 'golden' standard, one which is rolled out and adhered to by all remote sites and branches, thanks to centralised and remote control.
This benefit became evident for one of RDB Consulting's large multinational manufacturing clients. With its head office based in the Western Cape and branches throughout the country as well as in some parts of Africa, compliance became a top priority. RDB Consulting's remote support and monitoring service allowed the company to enforce security standards and the use of standardised site documentation, processes and procedures, as well as drive compliance with the Service Level Agreement (SLA), which ultimately assisted the company in complying with a variety of Acts.
Time to resolution
Disasters are never planned. Downtime, especially in a database environment, impacts operations, and the longer the database or a critical application is down, the greater the impact on the bottom line. Resolving the problem quickly is therefore of the essence. Remote support and monitoring is a great enabler of speedy resolution. More importantly, because it is proactive in nature, it often prevents problems from occurring in the first place.
Control the changes in your organisation
Remote support and monitoring also manages change control more efficiently. For example, creating printer rights on the corporate network can be handled remotely, ensuring user rights are applied according to policy.
Access to skills
Outsourcing support and maintenance of the IT infrastructure already delivers the benefit of access to skills and experience. Remote support and monitoring further enhances this as a team of highly skilled resources is monitoring the IT environment and the appropriate skills are at hand - immediately.
So what investment is required to set up remote support and monitoring? All that is needed is a dedicated server. However, as the adoption of various cloud computing models gains impetus, a capital investment might not even be necessary, as companies can leverage the benefits of a pay-as-you-use model.
If you already have an outsource provider handling your IT support and maintenance, you may assume that you already have remote support. This is quite simply not the case: many outsource service providers only offer on-site support.
Although remote support and monitoring has many benefits, it is important to understand that this type of support complements on-site support. Without on-site support, the outsource provider would never be able to forge the relationship that face-to-face interaction enables. In addition, on-site support allows the outsource service provider to gain a deeper understanding of the company culture whilst keeping a finger on the pulse of the business.
This allows the outsource provider to integrate seamlessly with the company, creating a better 'fit'.
Therefore, in light of the above benefits of remote monitoring and support, businesses should look towards augmenting their on-site outsource contract with this type of service. The time to consider it is now.
About RDB Consulting
Established in 1995, RDB Consulting is an outsourcing and consulting company that specialises in four areas: relational databases, operating systems, monitoring and Enterprise Resource Planning. The organisation also offers project management, solutions architecture, and ongoing maintenance and support. Its services are designed to provide businesses with access to expert technical resources, whether full time, part time, co-managed or via remote administration. This allows companies to focus on their 'core' business and leave their ICT issues to the experts.
Distributor Drive Control Corporation (DCC) has recently unveiled the newly revamped HP 630 and 635 notebooks. This new range is ideal for students, cost-conscious consumers and small businesses, providing mobility and enhancing productivity right out of the box with preloaded Microsoft Office 2010 Starter Edition on both models.
The HP 630 and 635 Notebooks are easy on the eye with their smooth matte surface in an understated pewter colour, providing an elegant yet high-class look and feel. The HP 630 is powered by Intel's Core i3-370M processor and the HP 635 is powered by AMD's Dual-Core E-450 processor, ensuring that these notebooks can perform multiple tasks quickly and easily. With the energy efficient 15.6 inch diagonal LED backlit display, Wi-Fi and Bluetooth, working from home or your favourite hotspot is easy and hassle free.
Says Deon Botha, HP PSG Business Unit Manager at Drive Control Corporation: "The HP 635 and HP 600 series notebooks are perfect for cost-conscious buyers who don't want to sacrifice functionality. In addition, these notebooks are ideal for communication with their built-in webcams, Wi-Fi and Bluetooth. Connecting with other devices such as your smartphone, or making Skype calls to friends and family, is as simple as the touch of a button."
The HP 635 notebook comes pre-loaded with the Microsoft Windows 7 Basic Operating System (OS), making it perfect for the home user to surf the web and perform various online tasks. The HP 630 Core i3 model comes pre-loaded with the Microsoft Windows 7 Pro OS, making it perfect for small business computing, which generally requires the advanced networking features included in Windows 7 Pro.
Botha adds, "Many users find themselves in the situation where they have purchased a notebook with 'feature bloat', where extras that you won't necessarily use inflate the price of the product. The HP 635 and HP 630 range offers the essential mix of features to enable usability while staying within budget, without compromising essential functionality."
Not only do the HP 630 and 635 notebooks provide performance, but they also deliver an edge for entertainment enthusiasts, with stereo Altec Lansing speakers delivering an extraordinary listening experience. Further extending the entertainment value are multimedia control features on the keyboard, giving you the ability to play, forward and rewind audio and video files easily.
Experience truly remarkable high-definition (HD) with the HDMI port, which lets you connect directly to any high-definition display. With the Fast Charge feature, recharging your battery is fast and simple, taking just 90 minutes to recharge the primary battery to 90% when your notebook is off.
Both the HP 635 and 630 models feature 2GB of DDR3 RAM and 320GB hard drives, with optional warranty upgrades to enable on-site warranty response.
The HP 635 and 630 notebooks are available at all HP resellers and major retailers at recommended retail prices of R4,699.00 and R7,099.00 respectively, inclusive of VAT.
New IT implementations are typically fraught with challenges, and prior to awarding any project to a technology partner most organisations go through an incredibly stringent selection process. This initial selection phase tends to be an extensive, time consuming and expensive process, and can take months at a time, if not longer, to complete, particularly when going out to tender.
Despite this careful planning, however, a distressing number of IT implementations continue to fall short of expected benefits and returns after projects have been awarded. The blame for this is often squarely placed on the shoulders of the technology and implementation partner, and unfairly so in many instances. The truth is that ‘failed’ implementations can be the result of a combination of factors and pitfalls, all of which can be avoided to ensure that the implementation of a variety of IT projects is a success.
One of the most common mistakes businesses make is failure to follow through. Once the meticulous selection process is completed, the selection team ‘retreats’ from the project completely and simply leaves it in the hands of the technology partner. Considering that the selection process can take up to a year to complete and often involves great expense, the involvement of high-level executives and a team of evaluators, this lack of follow-through to the implementation stage does not make sense given the initial investment in selecting a vendor.
Another issue that can lead to less than satisfactory IT implementations is that the entire selection process becomes a mechanical box-ticking exercise, completed solely to cover the decision maker against risk. While following due process and due diligence is necessary to prevent unnecessary risk, there is a fine line between covering bases by ticking boxes on a checklist and ensuring that the solution chosen is the best fit for the company. Selecting a solution based purely on its ability to meet a checklist, and not on its fit with the needs of the business, can create a ‘disconnect’ between expectations and the solution that is ultimately delivered.
When it comes to new implementations, often the need for change management is also overlooked. Important role players who will be required to use the new solution are not identified, and the need for communication with these and other involved parties is ignored. This often leads to resentment on the part of these employees, who will then not use or embrace the new IT implementation, leading to its classification as a ‘failure’ even though the vendor may have delivered everything that was requested and required.
The decision-makers on new IT implementations are also often not the people who will actually be using the solution or who will benefit from it. The IT department is tasked by business with sourcing a technology solution, which is then done and handed back over to business, after which it is often left to the vendor to resell and explain the selected solution to the business, because the business users have not actually been involved in the selection process.
The length of the selection process itself is a hindering factor as well.
Because the decision-making takes up much of the time given for the selection and implementation of a project, the implementation time is shortened, which means that vendors may have to cut corners to deliver the solution within the specified time frame. During the lengthy selection process there is also the possibility that requirements may change and new versions of software may be released, which changes the game once again.
In general, the pitfalls of IT implementations can all be traced back to the selection process. The fact is that the amount of planning that goes into selection is not followed through in the implementation stages, which is problematic since the actual implementation is where the risk ultimately lies. The selection process extends the sales cycle unnecessarily and often the resources used to conduct this selection are disproportionate to the value of the implementation. Ultimately, many IT implementations fail as a result, even though the company may well have made the right technology decision, and the technology itself and the implementation provider often bear the brunt of the blame.
In order to avoid these pitfalls, the selection and implementation process needs to become more balanced, with equal weighting and focus given to both the evaluation and implementation phases. The evaluation team should also be empowered to make decisions on behalf of the business, and more business-oriented people should be included on this team, to ensure that the traditional disconnect between IT and business is kept to a minimum.
Communication between IT and business should also be improved.
The risk and materiality of the spend should also be identified up front, and the selection process scaled up or down accordingly to ensure that resources are not wasted on months of selection and evaluation on relatively minor implementations. To sum it up, the meticulous planning applied to selection and evaluation needs to be followed through to implementation, and realistic deadlines need to be set, to ensure that IT implementations live up to the expected results and benefits and can deliver value to the organisation.
In a world where digital information is more than doubling in size every two years, storage technology is experiencing a myriad of challenges to keep pace with the increasing demands for efficient and cost-effective methods to create, capture, manage and save all of this data.
Fortunately, computer storage technology is evolving with hard disk drives
(HDDs) at its core. Multiple streams of future innovations are being explored by scientists and engineers to evaluate possibilities as the path is paved for tomorrow’s standards – and volumes. Looking at some potential scenarios, global storage leader Western Digital gives some thought to the future of hard drive technology.
Digital content generated in 2010 alone was more than all the data created in the previous 5,000 years1. Given the 1.8 trillion gigabytes in 500 quadrillion “files” on track for this year, the digital universe represents nearly as many bits of information as there are stars in our physical universe.
Breaking the ZB Barrier
Industry analysts estimate that 75% of universal digital content is a copy2, according to an IDC report titled “Extracting Value from Chaos” on the exponential growth of the digital universe. Some statistics cited include:
• Despite the global recession, in 2010 the digital universe set a record, cracking the Zettabyte (ZB) barrier, and is expected to grow to 1.8 ZB this year. (For a visual image, this would look like a stack of DVDs reaching from the earth to the moon and back).
• By 2020 our digital universe will be 44 times as large as it was in 2009. (Metaphorically, the stack of DVDs would now reach halfway to Mars).
In terms of sheer volume, 1.8 ZB of data is equivalent to:
• Every person in the USA tweeting 3 tweets each minute (4,320 tweets per day, per person) for 26,976 years, non-stop3.
• Or, over 200 billion HD movies (each 120 minutes long); it would take one person 47 million years of 24/7 viewing to watch every movie3.
• Storing 1.8 ZB of data would take 57.5 billion 32 GB iPads3.
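The figures above invite a quick back-of-the-envelope check. The Python sketch below uses decimal units (1 ZB = 10^21 bytes, 1 GB = 10^9 bytes) and round numbers, so it lands slightly below the 57.5 billion iPad figure cited, which presumably reflects different rounding, and it confirms that growing 44-fold between 2009 and 2020 corresponds to roughly the "doubling every two years" claim:

```python
import math

# Decimal-unit assumptions (sources may round differently)
ZB = 10**21
total_bytes = 1.8 * ZB

# How many 32 GB iPads would 1.8 ZB fill?
ipads = total_bytes / (32 * 10**9)
print(f"{ipads / 1e9:.1f} billion iPads")  # 56.2 billion iPads

# Doubling time implied by "44 times as large as 2009, by 2020"
years = 2020 - 2009
doubling_time = years * math.log(2) / math.log(44)
print(f"doubling every {doubling_time:.1f} years")  # doubling every 2.0 years
```

The small gap between the computed 56.2 billion and the cited 57.5 billion is immaterial to the point: the volumes involved dwarf any single class of device.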
With the world’s electronic data expanding so rapidly, on track to surpass 2.0 ZB, the driving factors include:
• Digital Lifestyles
o Home Entertainment (streaming and personal content creation)
o Social Media
o Mobile Phones
• Security and Surveillance
• Cloud Computing
Anything stored on a smartphone or tablet is duplicated, backed up and/or stored somewhere else, and as more companies engage cloud services, the need for storing and securing data will only increase. Organizations storing important data in cloud platforms have considerable redundancy, backing up information in multiple data centers. Online communities and services such as Facebook, Flickr, Google+ and Twitter are backed up as much as three times. Interestingly, much of the data stored in the digital universe is not user-generated; it is ‘shadow’ data, created around user transactions and preference queries (“Maybe you would also like…?”) following online purchases, along with continuously updated streams of analysis stored, directed and accessed in the vortex.
Storage Companies Unite
While software and component supplier companies strive to develop their own solutions to cope with the digital explosion, competitive storage hardware manufacturers (Western Digital (WD), Hitachi Global Storage Technologies (Hitachi GST), Marvell, Seagate Technology, Xyratex, LSI, Texas Instruments, Fuji Electric, Veeco, Intevac, KLA-Tencor, and Heraeus) have joined forces to address these huge challenges.
The International Disk Drive Equipment and Materials Association (IDEMA) formed a special forum, the Advanced Storage Technology Consortium (ASTC), to share ideas on the technical evolution of next-generation hard drives. With demand for greater capacity and faster speeds always constrained by price, the processes for researching and developing future storage technologies could wind up too costly, risky, or unmanageable for any single drive manufacturer. Under ASTC, cooperative endeavors are in motion to collectively establish common specifications for future platforms.
As capacity and density requirements spiral upwards, storage companies face barriers with current technical standards. Building the bridge to future technologies presents challenges for both traditional drives and solid state drives (SSDs). For SSDs, as NAND flash reaches semiconductor limits for lithographies below 1X nanometers, new technologies such as Vertical NAND or 3D Stackable NAND are striving to extend NAND flash technology. Other technologies contending to succeed NAND include: 3D Resistive RAM (RRAM), Phase Change Memory (PCM) and Spin-Transfer Torque Magnetoresistive RAM (STT-MRAM).
Hard drives presently hover at maximum capacities of 3TB in the 3.5” form factor and lesser capacity for drives of a smaller footprint. As traditional drive recording begins to reach the ceiling of magnetic properties, several technologies are on the horizon to provide storage options for the industry.
Current magnetic drives employ Perpendicular Magnetic Recording (PMR), meaning the magnetic bits align perpendicularly to the spinning disk. Since PMR began shipping into commercial applications in 2005, storage densities have increased as much as eight times from the previous standard, longitudinal recording4. As storage demands escalate, disk capacity has grown based on areal density (AD) improvements, but the slowing of AD advances presents a dilemma: PMR is reaching its limit. An interim answer may be Shingle Magnetic Recording (SMR) technology.
Like shingle tiles layered on a roof, SMR writes partially overlapping data tracks in a particular direction, radially to increase a drive's areal density, getting more tracks on disk platter surfaces. Tracks are written with a write head wider than the track pitch, thus improving overwrite and adjacent track interference. SMR may help extend PMR by 20 to 50% over conventional recording, but SMR has its own issues associated with emulating the host random read/write commands as random read/sequential write operations on the media. An ideal fit for selective functions such as archive data storage, SMR will likely remain as a technology option in combination with other enablers on a forward basis.
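The read/write complication mentioned above can be illustrated with a toy model: because the write head is wider than the track pitch, updating one track in a shingled band disturbs the tracks layered on top of it, so those downstream tracks must be read and rewritten sequentially. The Python sketch below is a simplification (band size and contents are arbitrary, and real drives manage this in firmware), but it shows why a single random host write fans out into several physical writes:

```python
# Toy model of a shingled (SMR) band: writing track i overlaps
# track i+1, so tracks i..end must be rewritten in order.
class ShingledBand:
    def __init__(self, tracks):
        self.tracks = tracks  # contents of each track in the band

    def write_track(self, i, data):
        tail = self.tracks[i + 1:]            # read the downstream tracks first
        self.tracks[i] = data                 # write the target track
        for j, old in enumerate(tail, i + 1): # then restore the tail sequentially
            self.tracks[j] = old
        return 1 + len(tail)                  # tracks physically written

band = ShingledBand(["a", "b", "c", "d", "e"])
writes = band.write_track(1, "B")   # host asks for one track update...
print(band.tracks, writes)          # ['a', 'B', 'c', 'd', 'e'] 4
```

One logical write near the start of the band costs four physical writes here, which is why SMR suits mostly-sequential workloads such as archiving.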
In HDD evolution, moving from PMR technology and the present Exchange Coupled Composite (ECC) media to Energy Assisted Magnetic Recording (EAMR) may be the next logical step. EAMR technologies apply either a high-frequency magnetic field (microwaves) or heat (via an integrated laser) to a microscopic region on the recording media to facilitate the process of writing data. Expected to enable the number of bits that can be stored on the disk to up to 5 terabits or more per square inch, EAMR technologies should come to market in the next three to five years.
Another technology under investigation is that of Bit-Patterned Media (BPM) which involves pre-defining the size and position of “islands” (bits) in the recording medium via advanced lithography processes and is expected to increase storage density on HDDs to 10 terabits or more per square inch when used in conjunction with EAMR. Anticipated to come to market in five to seven years, BPM records individual bits on lithographed islands of strongly-coupled magnetic material which retain each bit’s magnetic charge, thereby allowing the bits to be far smaller than would otherwise be possible with continuous media. Speed-bumps to BPM’s implementation are significant cost and fabrication concerns.
As different storage methods come under review, what the hard drives of tomorrow will look like remains to be seen. Many variables and the possibility of new discoveries make concrete predictions impossible as vendors continue to explore and invest in a variety of technologies.
Offering the greatest value in computer storage for more than half a century, the hard drive industry will find a way to advance existing and develop new technologies to support the future needs of consumers and businesses alike.
2. IDC Study: “Extracting Value from Chaos,” June 2011.
WD, one of the storage industry's pioneers and long-time leaders, provides products and services for people and organizations that collect, manage and use digital information. The company designs and produces reliable, high-performance hard drives and solid state drives that keep users' data accessible and secure from loss. Its advanced technologies are configured into applications for client and enterprise computing, embedded systems and consumer electronics, as well as its own consumer storage and home entertainment products.
WD was founded in 1970. The company's storage products are marketed to leading OEMs, systems manufacturers, selected resellers and retailers under the Western Digital and WD brand names. Visit the Investor section of the company's website (www.westerndigital.com) to access a variety of financial and investor information.
Western Digital, WD, the WD logo, My Book, and WD TV are registered trademarks in the U.S. and other countries; My Book Live, WD 2go and WD TV Live are trademarks of Western Digital Technologies, Inc. Other marks may be mentioned herein that belong to other companies. As used for storage capacity, one terabyte (TB) = one trillion bytes. Total accessible capacity varies depending on operating environment.
Financial reporting has long been a vital part of any business, and on-time, accurate results are of the utmost importance. This highly pressurised, deadline driven environment has always been a pain point for financial executives, and it has only become more complicated over the years, with annual reports and board books compounded by statutory and regulatory reports, and recently the need to conduct sustainability and Corporate Social Responsibility (CSR) reporting as well.
This 'last mile' of reporting has traditionally been dominated by manual processes, tight deadlines and stressful conditions, and has always been time consuming and inefficient. Reporting needs to integrate data from multiple sources, both structured and unstructured, from across the enterprise. This requires collaboration from a variety of contributors, who have different areas of responsibility, and access to different data. This process also requires reports to be built, edited, reviewed, approved and published, all using different manual processes.
The manual process and the variety of parties involved, along with the number of iterations before final sign-off (Word and Excel files shared via e-mail, last-minute changes, lack of workflow control, lack of access control and lack of audit trail), have all led to multiple challenges, including inaccuracies as well as legal issues, as there is no evidence of compliance.
Some of the problems experienced across all the different sectors include errors in data collection from multiple locations, a lack of accuracy resulting from re-keying similar information into multiple reports, low productivity due to linear workflow with multiple bottlenecks, limited version control and integrity issues. These issues all present risk elements, including error, fraud, incorrect interpretation, lack of enterprise interoperability, lack of accuracy and failure to comply with regulation, with the potential to cause financial loss, legal challenges, loss of stock value, loss of reputation, fines and penalties.
Added to this, the proliferation of eXtensible Business Reporting Language (XBRL), which aims to provide a global standard for reporting, is fast gaining ground in South Africa. All companies listed on the Johannesburg Stock Exchange (JSE) that are dual listed on an international exchange are expected to begin using XBRL already, and this requirement is only set to expand. And existing inefficiencies are only compounded by the need to start all over again from scratch for the next period.
Technology provides the means to not only improve efficiency and shorten reporting cycles, but also reduce risk and enable the vast data gained from reporting to be used for analysis, allowing for better future business operations.
Automated last mile reporting provides a single, secure platform to automate and enhance controls over all management, statutory and regulatory reports.
This helps organisations to resolve the difficulties they face when it comes to preparing and filing external as well as internal financial documents, by eliminating manual processes and supporting collaboration, validation, access, workflow and version control as well as meeting XBRL mandates from stock exchanges such as the JSE and regulators.
By aggregating and consolidating financial and non-financial data automatically and using intelligent software with integrated XBRL functionality, organisations gain access to a collaborative solution that delivers a single version of the truth, with a complete audit trail, version and workflow control, integrated business rules and compliance checking as well as editable variables in the text, which can deliver reports automatically in a variety of outputs, from Microsoft Word and PowerPoint to PDF and even formats specifically for the various stock exchange requirements.
This reduces risk across multiple areas: risk of error is reduced, through a single version of the truth resulting in finance managers responsible for accuracy spending less time reviewing or handling last minute changes and spending more time analysing the results; the risk of late filing is reduced, since the reports are generated automatically from the aggregated data and can therefore be reviewed far sooner; the risk of insider leaks is minimised because access can be strictly controlled; and the risk of non-compliance can be addressed by building in compliance checks and audit trails, which increase confidence in the final report.
Research shows that more efficient report building via automation reduces overtime costs and even printing costs by up to 50%. The expected return on investment is between three months and one year.
Integrated XBRL reporting has several advantages. It allows for the tagging of all numeric and textual data through standard XBRL tagging, reading and viewing, along with secure taxonomy storage, controlled user access and workflow, and the ability to track taxonomy changes and create an audit trail. Automatic XBRL updates enable tagged data to be amended automatically, and data need only be tagged once, after which it automatically flows to future periods, eliminating the need to start from scratch on each new report.
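To make the idea of an XBRL-tagged fact concrete, the sketch below builds one with Python's standard library. It is heavily simplified: a real instance document also declares units and schema references, and the "acme" taxonomy namespace, the Revenue element and the figures here are hypothetical, purely for illustration:

```python
import xml.etree.ElementTree as ET

XBRLI = "http://www.xbrl.org/2003/instance"
ACME = "http://example.com/taxonomy"  # hypothetical company taxonomy
ET.register_namespace("xbrli", XBRLI)
ET.register_namespace("acme", ACME)

root = ET.Element(f"{{{XBRLI}}}xbrl")

# A context ties each fact to a reporting entity and a period
ctx = ET.SubElement(root, f"{{{XBRLI}}}context", id="FY2011")
entity = ET.SubElement(ctx, f"{{{XBRLI}}}entity")
ET.SubElement(entity, f"{{{XBRLI}}}identifier",
              scheme="http://www.jse.co.za").text = "ACME"
period = ET.SubElement(ctx, f"{{{XBRLI}}}period")
ET.SubElement(period, f"{{{XBRLI}}}instant").text = "2011-12-31"

# The tagged fact itself: a revenue figure bound to that context
fact = ET.SubElement(root, f"{{{ACME}}}Revenue",
                     contextRef="FY2011", unitRef="ZAR", decimals="0")
fact.text = "1000000"

print(ET.tostring(root, encoding="unicode"))
```

Because the value, its context and its taxonomy element travel together, any consuming system can interpret the figure without re-keying, which is precisely the efficiency the tag-once approach described above relies on.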
Automated last mile reporting prevents late submission and material errors in regulatory and statutory filings, prevents insider leaks and enhances weak internal controls and financial governance frameworks, reducing risk.
It eliminates manual data collection and consolidation as well as the need to re-key information, providing a secure environment for review and approval with automated error checking, reducing costs. It eliminates bottlenecks, complex time-consuming processes and manual updating of report data, which reduces the reporting cycle time. Finally, it offers enhanced consistency and integrity of reports, ensuring a single version of the truth and allowing for optimised analysis through intelligent, secure collaboration, improving business insight.
In addition to improving efficiency and reducing risk, automated last mile reporting helps companies comply with several recommendations of the new King III corporate governance code. For example, the code recommends that the audit committee consider the use of technology to improve efficiency and audit coverage, and that the CIO, and ultimately the CFO, understand the impact that ICT has on finance and, more importantly, on risk.
With the reporting process only becoming more onerous as the years pass, and accurate financial and non-financial data crucial to the success of the modern business, manual reporting has become an increasingly outdated model. By automating this process organisations can not only save time and money, they can improve efficiency, reduce risk and gain access to reports that deliver real business insights, enabling better outcomes and smarter decision making.
As cloud services have become more mainstream and abundant, and access to such services has become a simple matter of using a standards-compliant web browser, the concept of Bring Your Own Device (BYOD) has begun to emerge within corporate networks. While this practice has in the past been more common in smaller businesses such as consultancies with fewer than 50 employees, advances in the cloud have made BYOD a more feasible concept in larger organisations and enterprises as well.
A BYOD policy gives employees the freedom to pick and choose their own computing equipment for use in a work environment, with the employer co-sponsoring the device or paying a percentage towards it. Thanks to advances in cloud computing, BYOD also enables a non-homogeneous IT environment, with a multitude of different brands and devices all sharing the same network environment through a web browser front end. Employees no longer need specific machines with specific software and specifications in order to access the network, so individual preference in terms of brand and specs now plays a role in corporate IT networks. This has several benefits, but it also increases the complexity of the IT environment.
Allowing employees more choice gives them a feeling of ownership and control, and also enables them to express their individuality. From an employee perspective, the ability to select a device they would like to use rather than being saddled with a corporate hand-me-down machine increases satisfaction, and the BYOD cloud concept also enables the integration of handheld devices such as smartphones and tablet PCs onto the corporate network without compromising security, mitigating risk while still catering to individual needs and desires.
For the employer, BYOD typically results in employees looking after their devices better, since they have some investment in the technology and will be more inclined to take better care than they would a piece of technology that was simply given to them. It is also in the interest of employees to extend the lifecycle of their technology, since this will mean that they are no longer using a portion of their salary to pay for the device. The cloud driven BYOD concept also enables a truly mobile workforce, since the web browser driven environment can be accessed from anywhere on any authorised login or machine, removing the traditional office boundaries and enabling workers to access the full corporate network resources no matter where they are in the world and enabling employees to work and complete tasks when they choose. These aspects of cloud computing and BYOD particularly appeal to 'Gen Y' employees, who prefer working with technology to solve problems and enjoy self-managed environments.
While there are multiple benefits to a non-homogeneous IT environment, the BYOD model also brings with it complications, chief among them vastly increased complexity. In an enterprise where a hundred people each have their own individual device, and only five IT technicians are tasked with keeping these machines running, there are bound to be problems. The IT support personnel cannot be expected to be thoroughly versed in the maintenance and upkeep of every single type of machine, and each different machine and model will have different drivers and tools that must be kept up to date. Support costs will increase dramatically as a result.
In order to curb the increased complexity of the BYOD environment, corporates are typically taking one of two routes. The first is to offer a limited selection of devices which employees can choose from and then allow access to personal devices such as tablets and smartphones, which is the most practical option in a very large enterprise environment. The second option, which is more appealing for smaller and medium sized organisations, is to place the onus for technical support in the hands of the owner of the device, in other words the employee.
This makes after sales service and support a key consideration for both businesses and employees when selecting their BYOD equipment. Without adequate business oriented support, when problems occur productivity can be severely affected, and delays in repairs and maintenance can end up costing valuable business time and money. Warranties should offer same day or next business day on site support and guaranteed service level agreements to ensure that issues are resolved timeously. Businesses should align themselves with vendors that provide consistently high levels of service with customer references and recommendations to show this, and should encourage employees to select devices from these vendors in order to minimise the potential negative consequences of BYOD models.
Aside from after sales service, another key consideration should be the range of accessories and additional equipment available to increase productivity on BYOD devices. Attachments such as docking stations, additional chargers, extended-life battery packs and RAM upgrades can all help to improve productivity and enhance the user experience. Security devices such as Kensington locks and slots ensure devices cannot easily be stolen, while privacy filters are ideal for environments where employees work with sensitive or confidential information.
The BYOD model is ideally suited to today's employee, who demands greater choice and flexibility in all areas, including working on equipment such as computers and mobile devices. However, while this model offers many benefits both to employees and employers, there are pitfalls that need to be avoided.
Partnering with a reputable vendor that offers a full range of accessories as well as high levels of after sales service and support will enable organisations to take advantage of the benefits of BYOD while avoiding potential problems.
Viruses, malware, spyware and spam are all common terms in today's connected world, and most users who access the Internet using a PC, laptop or notebook would not dream of doing so without some sort of security solution.
Cybercrime has become the world's most profitable underworld 'business' and users are now aware of the need to protect themselves from falling victim to any number of threats.
However, the world has become increasingly mobile over time and technology has evolved: cellphones have become smartphones, joined by tablet PCs and other mobile devices that allow always-on connectivity. While such devices have revolutionised the way we work and have grown exponentially in popularity in recent years, they also open up new avenues for cybercriminals with malicious intent to exploit users.
In the cybercrime industry, as with any profitable business, the smart enterprise goes where the customers are. As more users migrate onto a variety of mobile platforms, from BlackBerry to Android, Windows Mobile, Apple Mobile OS and even tablet devices, cybercrime is increasingly targeting these platforms with a variety of malware, including viruses, spam and spyware, all with the express purpose of making money.
Users are now vulnerable to viruses that can shut down their smartphones, identity theft as a result of spyware, and the entire gamut of threats that were once the exclusive realm of computers. As a result, mobile security is becoming increasingly important to protect users from falling victim to malicious activity.
Aside from the threat of malware, the increasing trend towards mobility, along with tools such as push email functionality, also means that more and more personal and business information is now stored on highly portable devices. The greater the level of portability, the greater the probability that the device, and the information it holds, may be lost or stolen. Having personal and business critical information fall into the wrong hands is undesirable for obvious reasons, and the ability to remotely wipe the device in the event of it being misplaced is another strong reason why mobile security is so important in today's world.
When looking for a mobile security solution to protect devices such as smartphones and tablets from malware and to ensure personal information does not fall into the wrong hands, there are several features to look out for. These take care of security in the event of loss or theft, help to prevent malware attacks from causing problems on the device, and prevent cross-infection of other devices.
Remote locking capability is one crucial feature, which lets you remotely disable a lost or stolen phone to prevent strangers from accessing private information. It also prevents thieves from actually using the device, so that they cannot run up large phone bills at your expense. This, combined with SIM card locking capability, which instantly locks your phone if its SIM card is removed so thieves cannot use it with a different SIM card, ensures that even if your phone is stolen, the thieves will not profit from it, as the device is effectively rendered unusable.
Remote wipe is also critical, as this lets you actually erase all of the information contained on your phone if it is lost or stolen, including contacts, text messages, call history, browser history, bookmarks, and any data on the phone's memory card. Mobile threat protection will detect and remove malware such as viruses and other threats without affecting phone performance, and download threat protection will automatically scan all files and app updates for threats.
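The loss-and-theft features described above can be sketched as a simple state model. This is a hypothetical illustration of the behaviour, not any vendor's actual product; the class and method names are invented for the example.

```python
class MobileSecurityClient:
    """Illustrative model of remote lock, SIM-card lock and remote wipe."""

    def __init__(self, registered_sim):
        self.registered_sim = registered_sim
        self.locked = False
        # Personal information that a remote wipe would erase.
        self.data = {"contacts": ["Alice"], "messages": ["hi"], "call_history": ["Bob"]}

    def on_sim_changed(self, new_sim):
        # SIM card locking: lock instantly if the SIM card is swapped,
        # so a thief cannot use the device with a different SIM.
        if new_sim != self.registered_sim:
            self.locked = True

    def remote_lock(self):
        # Remote locking: disable a lost or stolen device over the air.
        self.locked = True

    def remote_wipe(self):
        # Remote wipe: erase contacts, messages, call history and other data.
        self.data = {key: [] for key in self.data}
        self.locked = True

phone = MobileSecurityClient(registered_sim="SIM-001")
phone.on_sim_changed("SIM-999")   # a thief inserts a different SIM card
phone.remote_wipe()               # the owner wipes the device remotely
```

After the SIM swap and remote wipe, the device is locked and holds no personal information, which is the "rendered unusable" outcome the article describes.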
Aside from these now vital security components, a comprehensive mobile security solution also lets you block calls and text messages from specific phone numbers, perfect for getting rid of annoying spam SMSes and unwanted callers.
Mobile devices are increasingly common, which makes them more and more vulnerable to attack. Protecting all of the personal information stored on your mobile device is of the utmost importance to prevent all too common issues such as identity theft, and mobile security is no longer a nice to have, but a necessity for both consumer and business users in the modern world.
Binary Tree’s E2E Complete 2.0 is the Smart, Fast and Simple Solution for Upgrading to Microsoft Exchange 2010 and Consolidating Exchange Systems
29 November 2011
Binary Tree, a provider of software solutions for migrating to Microsoft Exchange and SharePoint, recently announced the availability of E2E Complete 2.0, which adds support for cross-forest (inter-organisation) Exchange migrations in addition to its existing capabilities for automating intra-organisation Exchange upgrades.
“E2E Complete now provides customers with a comprehensive Exchange migration solution that they can use to upgrade, consolidate, segregate, and migrate their Exchange Server environments,” states Val Vasquez, E2E Complete Product Manager for Binary Tree. “And since E2E Complete is licensed to individual mailboxes, customers can migrate these mailboxes between on-premises Exchange environments and Microsoft Office 365 as many times as they need to.”
E2E Complete is architected to use the latest Exchange Management Shell (PowerShell) commands in the background to move mailboxes at speeds an order of magnitude faster than agent-based tools, and provides true bi-directional synchronisation between Active Directory (AD) environments as well as public folder migration to support legacy Exchange 2003 environments. E2E Complete provides project managers and administrators with centralised management and reporting, automated user communications, and the ability to automatically build a schedule and forecast for the project based on a small migration sample, taken continuously.
Binary Tree’s E2E Complete software was recently named "the Best Exchange / Unified Communications Product" in the “2011 Best of Connections Awards” by Penton Media’s Windows IT Pro, SQL Server Magazine, DevProConnections and SharePoint Pro.
Says Chris Hathaway, Director at Soarsoft Africa and distributor of Binary Tree solutions, “This award is testimony to Binary Tree’s innovative solutions and pedigree as a dedicated migration company over the last 16 years. Our recent distribution agreement with Binary Tree means that the local market now has access to these solutions and methodologies, that have been used for the largest and most complex migrations in the world, including sites with over 200 000 users.”
About Soarsoft Africa
Soarsoft Africa has specialised in Archive, Migration and Messaging services for over a decade, leveraging the experience gained from some of the largest and most complex implementations around the world to deliver cost effective and successful projects and solutions. Soarsoft Africa remains product independent, but supports and implements what it considers to be trusted and proven solutions to meet specific customer requirements.
Soarsoft Africa continues to evaluate its offerings to ensure they maintain market and technological leadership positions, so that clients are offered the very best advice when assessing products and solutions that will match their requirements.
Soarsoft Africa has offices in Johannesburg and Cape Town South Africa.
About Binary Tree
Binary Tree is a leading provider of software for migrating enterprise messaging users and applications to on-premises and cloud-based versions of the Microsoft platform. Since 1993, Binary Tree and its business partners have helped over 5,000 customers around the world to migrate more than 20,000,000 email users. Binary Tree’s suite of software provides solutions for migrating from Exchange 2003/2007 and Lotus Notes to on-premises and online versions of Exchange and SharePoint. Binary Tree is represented by business partners worldwide who provide specialized services and a proven methodology for guiding customers through complex transitions. Binary Tree is a Microsoft Gold ISV Partner, an IBM Premier Business Partner, and is Microsoft’s preferred vendor for migrating to Microsoft Office 365. Binary Tree is headquartered in the New York metropolitan area with international offices in London, Paris, Stockholm, Singapore, and Sydney. For more information, please visit us online at www.binarytree.com.
Business Intelligence (BI) is touted as the enabler of sound business decisions through the use of accurate reports. However, the high failure rate of BI implementations over the years shows that simply incorporating BI into the organisation and producing reports is not a 'silver bullet'.
Fundamental to the success of a BI project is the quality of the data, and if businesses do not address Data Quality (DQ), it is akin to putting the cart before the horse. DQ tools are therefore an imperative for organisations that are embarking on a BI project as well as those companies that have already implemented a BI solution but are not yielding the expected results.
When an organisation has a BI solution in place, CEOs and decision makers are led to believe they are making decisions based on fact, when often the reports show only a version of the truth that may be inaccurate or have been manipulated. The reality is that quality data is vital to making quality business decisions. It is all very well having a highly sophisticated BI solution that generates reams of reports for businesses to base decisions on, but if the data these reports are generated from is poor, the reports will be correspondingly poor and bad decisions are likely to be made.
While the failure of BI may be attributed to a number of factors, research shows that poor data quality is the major contributor towards the high incidence of failed BI implementations. This has led to a growing trend for organisations to begin looking at DQ solutions in their own right, rather than simply as part of the overarching BI system or as an add-on solution.
Historically, DQ budgets have been spent on manual, unrepeatable processes, which fail to yield ongoing improvements and lead to a lack of understanding from business as to why money needs to be spent in this area.
Decision makers also do not realise that business problems can be the result of data problems, since they regard data as technical and the domain of Information Technology (IT). The relationship between data and business processes has not been clearly understood, yet all business processes rely on data.
Poor data quality is often hidden from the business, since IT will spend vast amounts of time and money producing results that business can make sense of in the form of reports, but the information contained in reports may not bear a complete resemblance to the underlying data. Business assumes that reporting is based on correct, quality data, but this may not always be the case.
The recent partnership between QlikView, a BI vendor, and Trillium Software, a specialist best-of-breed DQ solutions provider, highlights a growing trend for BI vendors to incorporate comprehensive data quality software as part of a complete BI offering, due to growing customer demand. Decision makers are now realising the connection, and are beginning to understand the need for better quality data in order to drive better decision making.
The implications of poor data quality are highly dependent on the particular business, but since every critical business process relies on data from the supply chain to credit risk to invoicing and so on, the effectiveness of these processes relies on accurate underlying data. For example, if the data received for invoicing is incorrect then incorrect invoices will be issued.
This results in poor collections and has a negative impact on the company's reputation. Even legitimate invoices may be queried or have delayed payment, resulting in cashflow problems and the massive, unnecessary expense of resolving each query. Therefore, poor data quality can introduce a host of problems to any organisation including risk, financial and damage to the organisation's reputation.
When it comes to DQ, there are two paths a business can take. The company can either accept poor quality data, or implement a sustainable, repeatable and auditable solution that ensures data is captured 'right' the first time, managing DQ proactively and preventing quality issues before they become problems. DQ solutions should be selected much like any other business critical tool, based on their ability to solve your specific business problem.
This means that you should evaluate data quality solutions in their own right, just as you would evaluate any other critical infrastructure. You should consider both technical aspects - such as the ability of the solution to integrate with your data sources, and to support both batch and real time data cleansing - and business aspects. Other questions that should be asked include: does the platform provide a quick start in the form of pre-packaged knowledge of your data? Does the vendor bring data management experience to the table, or are they simply dropping a product that is not their core focus?
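At its simplest, the batch cleansing mentioned above means running every record through a set of validation rules and separating clean records from rejects before they reach reporting. The sketch below is a generic, minimal illustration of this rule-based approach, not any vendor's product; the rules and field names are invented.

```python
import re

# Minimal sketch of rule-based data quality checks: each field has a rule,
# and records are validated in batch before they feed downstream reports.
RULES = {
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "") is not None,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(records):
    """Return (clean, rejected) lists, each pairing a record with the rules it failed."""
    clean, rejected = [], []
    for record in records:
        failures = [field for field, check in RULES.items() if not check(record.get(field))]
        (clean if not failures else rejected).append((record, failures))
    return clean, rejected

invoices = [
    {"email": "ap@example.com", "amount": 1500.0},   # passes both rules
    {"email": "not-an-address", "amount": -20},      # fails both rules
]
clean, rejected = validate(invoices)
```

The same rules can be applied at the point of capture for real time cleansing, which is how data gets captured 'right' the first time rather than repaired afterwards.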
The importance of DQ is a growing trend within the BI space, as highlighted by the fact that pure play DQ vendors such as Trillium have seen impressive growth in Europe and the USA, even in the grips of a massive recession. While South Africa tends to lag behind the technology curve, the 2011 ITWeb BI Survey shows that a quarter of all respondents cited data quality as a reason for the failure of BI. While only 10% of respondents stated that a DQ solution was on the cards, we are beginning to see investment into best-of-breed DQ solutions in the country, particularly within financial services, and following international trends, we can expect growth in this space in the next few years.
About Master Data Management
Master Data Management (MDM) provides specialist solutions for data governance, data quality, data integration and MDM. Leveraging the international expertise of its vendors, including Harte-Hanks Trillium, Panviva, Varonis and Expressor Software, MDM has provided solutions for clients in financial services, government, mining and telecommunications.