
Tips for Choosing a Web Designer for Your Business Web Site

How to Choose a Web Design Firm

Simple: you do your homework on them. Then you start asking questions and taking notes. There are plenty of web designers available, but you want to go with the best, because your web designer is, in essence, your business partner. You want to choose a designer that takes YOUR business seriously.

What questions do you ask?

There are several important questions to ask when choosing a web designer for your business web site.

Creating your web site can be a tricky process. Choosing the best web design firm for your business web site is a very important decision, and if your company is like most small businesses, you probably do not have web design experience. Building your web site will take time and work, and working with a web designer is no easy task. So choose the right web design company from the start and avoid do-overs, which can be costly and time consuming.

1. What kind of web experience do you have?

For starters, find out what kind of design experience your potential design firm has. Do they have experience with content management systems such as Joomla or Drupal? Do they have experience working with “raw” HTML? Has the web design company created web sites similar to yours? Do they have relevant industry experience? If you want to sell products through your web site and accept credit card payments, does the web design company you are considering have experience with ecommerce hosting?

2. Do you have a portfolio that I can review?

An experienced web design company will have a solid portfolio of web sites that they have created for other clients. Ask for links to other sites the design company has created and review each one. Do you like what you see? Do the sites have a style that appeals to you?

3. Do you have any references?

In addition to reviewing web sites, ask for customer references. Contact their clients and ask them about their experience with the web design company. Were they happy with the results? Did they get what they paid for? How much did they pay? Would they recommend them? How long did it take? What didn’t they like about the company? How responsive was the company when they had questions?

4. What are your prices?

The most important step in pricing is to make sure the potential design company outlines all of the costs associated with the work and puts it all in writing. Never enter into a deal unless all of the costs are well understood up front.

Ask them a bit about how they manage payments. If they respond in a very business-like and professional manner, this is a good sign. If they throw out answers like “Don’t worry, we’ll manage” or “Whatever you are comfortable with,” don’t be fooled. This is trouble waiting to happen. Get the price in writing before you begin the project.

5. Do you have experience with search engine optimization?

Most small business owners do not have it in their budget to hire a separate marketing firm to work on search engine optimization (SEO), so it is imperative that your web designer have experience in SEO. A good designer will know that design and SEO go hand-in-hand. Designing a web site for search engines with “clean” code that utilizes cascading style sheets is essential to getting your content indexed in the leading search engines, such as Google and Bing.

6. Do you have experience with social media marketing?

Many marketing firms do not know the first thing about social media marketing. These firms are stuck in the past and are not as effective as they pretend to be. Be sure that you work with a designer that knows how to set up a Facebook fan page for your business and design a customized Twitter profile. This is important because you will want your social media properties to mesh with the design of your web site. The web site and social media pages should complement one another.

7. What is your process for designing or building a web site?

Make sure you ask your potential web design company about the process they use. Do they design a web site, or do they build a web site? An experienced Internet professional should understand the difference between these two concepts. If they don’t, they’re probably not as experienced as they claim to be. Building a web site is a highly technical process, while designing a web site is a highly creative process. Many advertising firms specialize in web site design, which does not necessarily require any web development skills whatsoever. At the same time, many firms build web sites yet outsource the creative portion of the project. Find out from the beginning what the process is for the firm that you are considering.

8. How long will it take?

Perfectionism can be a huge stumbling block in the fast-paced world of the Internet. Some designers are unable to compromise between quality and time-to-market needs. A quick test: see how long it takes until you receive a proposal.

9. What type of support is offered after web site launch?

If your design firm does not offer web site maintenance, you might want to continue looking. Most reputable design firms will offer “post-launch” maintenance for companies that do not have an in-house webmaster.

Is Green Web Hosting The Future of The Web?

Count on it.

The world wide web runs on electricity, though we don’t usually think about it. But the fact is, without electricity you wouldn’t be reading this. You’d hunker down in front of the fireplace and read the newspaper by the dim light of a lamp fueled by whale oil. Sound like fun?

It wasn’t. It was actually pretty boring compared to today’s instant information available only on the W3. If it happens anywhere, from Teheran to Terre Haute, it spreads virally across the digital landscape in seconds. That news bit also gobbles up electricity like nobody’s business.

So, is green hosting the wave of the future web? Yes. And if you aren’t riding the crest of that wave, you’ll be left behind in the digital dust. An energy munching web site is sooo “last millennium.”

The Ever Expanding World Wide Web

Some facts about green hosting:

  • There are more than 125 million web sites online today.
  • Each day, 6,000 new web sites launch, carrying with them the dreams and visions of web-preneurs looking to become the next Amazon.
  • Each web site sits on a server that requires electricity to run it and cool it. That requires a lot of electricity.
  • The electricity bill for a successful web host would make a grown man cry. It’s HUGE. Many web hosts pay thousands of dollars monthly to keep their servers juiced.
  • The consumption of electricity used by web hosts doubled in just five years.
  • With the advent of Web 2.0 features like videos, VoIP, streaming TV and other “must-haves,” the web will only expand the demand for more and more electricity.
  • Experts suggest that energy consumption by web hosts will continue to double every five years and studies show these web pros are actually being conservative. Some “green” bloggers suggest the amount of electricity consumed by web hosts will double every 30 months.
  • Energy consumption to power the W3 grows exponentially, doubling then quadrupling and so on. The demand for power from web hosts will increase at a phenomenal rate.
  • As the world wide web grows in both size and features, web site owners will require increasing amounts of bandwidth to avoid long download times.

What Are Web Hosts and Why Do They Gobble Up Energy?

Conduct a Google search of web hosts and see what pops up.

You’ll get 149,000,000 search results for web hosts. Now, not all of these SERP links are for actual web hosts. Some are for review sites (that use web hosts), blogs about web hosts (that use a web host to get their blogs out to the masses), and even SERP links to sellers, resellers, and re-re-sellers of hosting services.

Hosting is a commodity on the web. You or I can buy space from a web host and open up our own hosting company. The mother host provides all the tools and support you need to build your own hosting company in the basement office. (The one that floods occasionally. Ooops.)

A web host can be a kid down the street or it can be a huge, physical plant with chipheads tending to racks of servers, customer support taking calls from subscribers and office people tending to routine business matters – like paying the electric company.

What’s a server? Well, in simplest terms, a server is not much more than a humongous hard drive in a box. Your web site (or future web site) resides on one of these server hard drives, along with hundreds of other web sites. Today, server disk space is measured in terabytes.

What’s a terabyte? A measurement of bits and bytes on steroids. A terabyte is the equivalent of 1,000 gigabytes. More dramatically, a terabyte equals 1,000,000,000,000 bytes, or 10 to the 12th power bytes.

That brand new computer you just bought MIGHT have a 500 gigabyte hard drive. A server has dozens of terabytes of storage. It also has tons of RAM (128 gigabytes of RAM isn’t unusual) so that the server can deliver the bandwidth required for fast downloads of all the sites stored on that server.

In other words, web hosting companies consume a lot of electricity with rack upon rack of servers all sucking up electricity from the grid.

Facebook alone employed 30,000 servers as of October 2009, and that number grows daily as more and more of us connect through this social media site. Amazon employs thousands of servers. Microsoft, Verizon, and all of your other favorite sites employ thousands of servers strung together in arrays.

In other words, there are millions of servers storing terabytes of information available to you on the W3. Yep, even your little blog gobbles up electricity.

Green Hosting

So, even though we may not think about energy consumption when we stake a claim to some digital real estate and build and launch a web site, we are adding to the demand for more and more electricity.

Now, along comes green hosting – hosting companies that employ “green” technology to lower the demand for electricity generated by coal- and gas-fired electricity generation plants. These businesses recognize that green hosting is inevitable as energy costs rise and we continue to pump tons of airborne pollutants into the atmosphere every day.

Things ain’t going to get better, folks, unless our corporate culture does an about face and stops drawing down available energy. Major cities, like Los Angeles and Phoenix, already experience rolling blackouts as parts of the energy grid are shut down for a while. New York City broadcasts “please turn off your air conditioners” on the hottest summer days and brown-outs are almost routine.

Green hosting is leading the way in how U.S. businesses conduct on-line business by employing green sources of energy to power their servers.

A green host doesn’t add to the demand for more electricity from traditional sources. Instead, these far-sighted companies employ new technology – solar power, wind power, deep core earth energy, hydro-electric (where available) and bio-fuels that can be regenerated with another harvest of corn.

But there’s a lot more to web hosting than just storing web sites on gigantic hard drives. It’s not just about storage. It’s also about RAM, which translates into the speed at which your web site interacts with site visitors. It better be fast. Studies reveal that 90% of us will sit through a 10-second download while only 10% will sit through a 30-second download. We’ve become that impatient.

From a site owner’s perspective, that stat translates into a loss of 80% of your prospects in that 20-second download window. So you want more RAM, more bandwidth and unfettered access to the server’s CPU and other shared assets – server parts you share with other sites.

Hosts employ the latest in fiber optic technology, boost RAM, and deliver quick downloads even quicker. Green web hosts do this without further straining the energy system that we rely on to log on, watch TV, and cook dinner, by using non-traditional resources to power up your web site whenever a site visitor stops by.

Machine-Readable, Structured Data With Meaningful Annotations

Until recently, software agents could not handle many kinds of information that could have been associated with files. Although file structure and extensions provided some information about files, much information could not be expressed. For example, a file with a .jpg extension has always represented a JPEG image but provided no information about the shutter speed, exposure program, F-stop, aperture, ISO speed rating, or focal length until the introduction of metadata formats such as Exif and XMP. However, sharing metadata stored in binary files is still not the most efficient way to share metadata, especially if the metadata is much more generic. In the digital era, electronic files are being sold (e-books, MP3 files, and so on) that might be retrieved or played on many types of devices.

A variety of metadata technologies can be used to express arbitrary information and represent any kind of knowledge associated with electronic documents in a machine-readable format. Machine-readable data (automated data) is data stored in a machine-readable format, making it possible for automated software agents to access and process it without human intervention. To browsers, web documents long consisted of human-readable data only; in fact, information was conflated with the documents that contained it. In contrast to the conventional Web (the “Web of documents”), the Semantic Web is the “Web of data.” The Semantic Web provides machine-processable data, making it possible for software agents to “understand” the meaning of information (in other words, its semantics) presented by web documents. This capability can be used by a variety of services, such as museums, community sites, or podcasting.
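To make this concrete, the following minimal sketch reads Exif tags from a JPEG with the Python Pillow library; the file name photo.jpg is a placeholder, and which tags appear depends on the device and software that produced the file.

```python
from PIL import Image
from PIL.ExifTags import TAGS

# Open a placeholder JPEG; Pillow exposes its embedded Exif metadata.
image = Image.open("photo.jpg")
exif = image.getexif()

# Map numeric Exif tag IDs to readable names and print each value;
# which tags are present varies from file to file.
for tag_id, value in exif.items():
    print(TAGS.get(tag_id, tag_id), value)
```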

Note that the word semantic is used on the Web in other contexts as well. For example, HTML5 supports semantic (in other words, meaningful) structuring elements, but this expression refers to the “meaning” of elements. In this context, the word semantic contrasts the “meaning” of elements, such as that of section (a thematic grouping), with the generic elements of older HTML versions, such as the “meaningless” div. The semantics of markup elements should not be confused with the semantics (in other words, machine-processability) of metadata annotations and web ontologies used on the Semantic Web. The latter can provide far more sophisticated data than the meaning of a markup element.

Conventional web documents can be extended with additional data that add meaning to them rather than structure alone. The Semantic Web is an approach that is changing the world of the Web: data can be retrieved automatically from seemingly unrelated fields in order to combine them, find relations, and make discoveries. Tim Berners-Lee described the vision behind the Semantic Web as early as 2001. The Semantic Web should be considered an extension of the conventional Web, not a replacement for it.

Two terms are frequently associated with the Semantic Web, although neither of them has a clear definition: Web 2.0 and Web 3.0. Web 2.0 is an umbrella term used for a collection of technologies that form the second generation of the Web, such as Extensible Markup Language (XML), Asynchronous JavaScript and XML (Ajax), Really Simple Syndication (RSS), and Session Initiation Protocol (SIP). They are the underlying technologies and standards behind instant messaging, Voice over IP, wikis, blogs, forums, and syndication. The next generation of web services is more and more frequently denoted as Web 3.0, an umbrella term usually referring to customization, semantic content, and more sophisticated web applications that move toward Artificial Intelligence (AI), including computer-generated content.

The Semantic Web is a major aspect of Web 2.0 and Web 3.0. Web 3.0 can be considered a superset of the Semantic Web that features social connections and personalization. Several technologies contribute to the sharing of such information instead of web pages alone, and the number of Semantic Web applications is constantly increasing.

On the Semantic Web, there is a variety of structured data, usually expressed in, or based on, the Resource Description Framework (RDF). Similar to conventional conceptual modeling approaches, such as class diagrams and entity relationships, the RDF data model is based on statements that describe and feature resources, especially web resources, in the form of subject-predicate-object expressions. The subject corresponds to the resource. The predicate expresses a relationship between the subject and the object. Such expressions are called triples. For example, the statement “The sky is blue” can be expressed in an RDF triple as follows:

  • Subject: “The sky”
  • Predicate: “is”
  • Object: “blue”

RDF is an abstract model that has several serialization formats. Consequently, the syntax of the triple varies from format to format. Keep in mind that RDF is a concept, not a syntax.
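For instance, here is a hedged sketch of how the triple above might be built and serialized with the Python rdflib library; the example.org namespace and the property name color are made up, since the prose statement has no canonical vocabulary.

```python
from rdflib import Graph, Literal, Namespace

# Hypothetical namespace; a real dataset would use an established vocabulary.
EX = Namespace("http://example.org/")

g = Graph()
# Subject ex:sky, predicate ex:color, object the literal "blue".
g.add((EX.sky, EX.color, Literal("blue")))

# Turtle is just one of RDF's several serialization formats
# (rdflib 6+ returns the serialization as a string).
print(g.serialize(format="turtle"))
```

The same graph could just as well be emitted as RDF/XML or N-Triples, which is exactly the sense in which RDF is a concept rather than a syntax.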

The authors of the “conventional” Web usually publish unstructured data, because they do not know about the power of structured data, find RDF too complex, or do not know how to create and publish RDF in any of its serialization formats. The following approaches address the problem by adding structured data to conventional (X)HTML markup, which appropriate software can extract and convert to RDF (a small extraction sketch follows the list):

  • Microformats, which reuse markup attributes
  • Microdata, which extends HTML5 markup with structured metadata
  • RDFa (RDF in attributes), which expresses RDF in markup attributes that are not part of (X)HTML vocabularies
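To show how such annotations become machine-readable, here is a minimal sketch that pulls microdata name/value pairs out of a markup snippet using only the Python standard library; the snippet and its schema.org properties are illustrative, and a real extractor would also track itemscope and itemtype.

```python
from html.parser import HTMLParser

# A hypothetical snippet annotated with HTML5 microdata attributes.
SNIPPET = """
<div itemscope itemtype="https://schema.org/Person">
  <span itemprop="name">John Smith</span> works as a
  <span itemprop="jobTitle">web designer</span>.
</div>
"""

class MicrodataExtractor(HTMLParser):
    """Collects (itemprop, text) pairs from microdata-annotated markup."""

    def __init__(self):
        super().__init__()
        self.current_prop = None
        self.pairs = []

    def handle_starttag(self, tag, attrs):
        # Remember the property name declared by an itemprop attribute.
        attributes = dict(attrs)
        if "itemprop" in attributes:
            self.current_prop = attributes["itemprop"]

    def handle_data(self, data):
        # The first non-empty text node after itemprop is its value.
        if self.current_prop and data.strip():
            self.pairs.append((self.current_prop, data.strip()))
            self.current_prop = None

extractor = MicrodataExtractor()
extractor.feed(SNIPPET)
print(extractor.pairs)  # [('name', 'John Smith'), ('jobTitle', 'web designer')]
```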

All data controlled by conventional web applications are kept by the applications themselves, making a significant share of data and their relationships virtually unavailable for automated processing. Semantic Web applications, on the other hand, can access these data through the general web architecture and transfer structured data between applications and web sites. Semantic Web technologies can be widely applied in a variety of areas, such as web search, data integration, resource discovery and classification, cataloging, intelligent software agents, content rating, and intellectual property rights descriptions. A much wider range of tasks can be performed on semantic web pages than on conventional ones; for example, relationships between data and even sentences can be processed automatically, and with much higher efficiency.

For example, a very promising approach provides direct mapping of relational data to RDF, making it possible to share the data of relational databases on the Semantic Web. Since relational databases are extremely popular in computing, databases that until now have been stored on local hard drives can be shared on the Semantic Web. Commercial RDF database software packages are already available on the market (5Store, AllegroGraph, BigData, Oracle, OWLIM, Talis Platform, Virtuoso, and so on). Semantic tools can also be used in a variety of other areas, including business process modeling and diagnostic applications.
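As a hedged sketch of the direct-mapping idea (not any particular product’s implementation), the following Python/rdflib fragment turns one made-up relational row into triples; the customer table, its columns, and the example.org namespace are all hypothetical.

```python
from rdflib import Graph, Literal, Namespace, URIRef

# Hypothetical namespace for the generated resources and properties.
EX = Namespace("http://example.org/")

# A made-up row from a relational table "customer" with primary key "id".
row = {"id": 42, "name": "Acme Corp", "city": "Terre Haute"}

g = Graph()
# The primary key identifies the subject; each remaining column
# becomes a predicate, with the cell value as the object.
subject = URIRef(EX["customer/%d" % row["id"]])
for column, value in row.items():
    if column != "id":
        g.add((subject, EX[column], Literal(value)))

# Serialize the resulting graph to Turtle.
print(g.serialize(format="turtle"))
```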

Why Web Services?

Overview

Component-based programming has become more popular than ever. Hardly an application is built today that does not involve leveraging components in some form, usually from different vendors. As applications have grown more sophisticated, the need to leverage components distributed on remote machines has also grown.

An example of a component-based application is an end-to-end e-commerce solution. An e-commerce application residing on a Web farm needs to submit orders to a back-end Enterprise Resource Planning (ERP) application. In many cases, the ERP application resides on different hardware and might run on a different operating system.

The Microsoft Distributed Component Object Model (DCOM), a distributed object infrastructure that allows an application to invoke Component Object Model (COM) components installed on another server, has been ported to a number of non-Windows platforms. But DCOM has never gained wide acceptance on these platforms, so it is rarely used to facilitate communication between Windows and non-Windows computers. ERP software vendors often create components for the Windows platform that communicate with the back-end system via a proprietary protocol.

Some services leveraged by an e-commerce application might not reside within the datacenter at all. For example, if the e-commerce application accepts credit card payment for goods purchased by the customer, it must enlist the services of the merchant bank to process the customer’s credit card information. But for all practical purposes, DCOM and related technologies such as CORBA and Java RMI are limited to applications and components installed within the corporate datacenter. Two primary reasons for this are that by default these technologies leverage proprietary protocols and that these protocols are inherently connection oriented.

Clients communicating with a server over the Internet face numerous potential barriers. Security-conscious network administrators around the world have configured corporate routers and firewalls to disallow practically every type of communication over the Internet. It often takes an act of God to get a network administrator to open ports beyond the bare minimum.

If you’re lucky enough to get a network administrator to open up the appropriate ports to support your service, chances are your clients will not be as fortunate. As a result, proprietary protocols such as those used by DCOM, CORBA, and Java RMI are not practical for Internet scenarios.

The other problem with these technologies, as I said, is that they are inherently connection oriented and therefore cannot handle network interruptions gracefully. Because the Internet is not under your direct control, you cannot make any assumptions about the quality or reliability of the connection. If a network interruption occurs, the next call the client makes to the server might fail.

The connection-oriented nature of these technologies also makes it challenging to build the load-balanced infrastructures necessary to achieve high scalability. Once the connection between the client and the server is severed, you cannot simply route the next request to another server.

Developers have tried to overcome these limitations by leveraging a model called stateless programming, but they have had limited success because the technologies are fairly heavy and make it expensive to reestablish a connection with a remote object.
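For contrast, here is a minimal sketch of the stateless, HTTP-based style that web services later standardized on: each request is self-contained, travels over port 80 (which firewalls generally allow), and can be routed to any server behind a load balancer. The endpoint and payload are hypothetical.

```python
import json
import urllib.request

# Hypothetical credit card processing endpoint; every request carries all
# the state it needs, so any server behind a load balancer can handle it,
# and a dropped connection only costs a retry of one request.
URL = "http://example.com/api/charge"

payload = json.dumps({"order_id": 1001, "amount_cents": 4999}).encode("utf-8")
request = urllib.request.Request(
    URL, data=payload, headers={"Content-Type": "application/json"}
)

# Plain HTTP over port 80, which corporate firewalls almost always permit.
with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode("utf-8"))
```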

Because the processing of a customer’s credit card is accomplished by a remote server on the Internet, DCOM is not ideal for facilitating communication between the e-commerce client and the credit card processing server. As in an ERP solution, a third-party component is often installed within the client’s datacenter (in this case, by the credit card processing solution provider). This component serves as little more than a proxy that facilitates communication between the e-commerce software and the merchant bank via a proprietary protocol.

Do you see a pattern here? Because of the limitations of existing technologies in facilitating communication between computer systems, software vendors have often resorted to building their own infrastructure. This means resources that could have been used to add improved functionality to the ERP system or the credit card processing system have instead been devoted to writing proprietary network protocols.

In an effort to better support such Internet scenarios, Microsoft initially adopted the strategy of augmenting its existing technologies, including COM Internet Services (CIS), which allows you to establish a DCOM connection between the client and the remote component over port 80. For various reasons, CIS was not widely accepted.