Hosting Article

By Alexia Pamelov

How To Synchronize Node.JS Install Version with Visual Studio 2015

CheapWindowsHosting.com | Best and cheap Node.JS hosting. Visual Studio 2015's installer has an option to install Node.JS as part of its regular install in order to support the built-in Gulp and Grunt task runners. However, I ran into an issue today: I updated Node.JS outside of Visual Studio, but since VS uses its own install that is separate from any outside installation, you can run into a node_modules package dependency problem where one version of npm installs a package (tying it to that version of Node/npm), and commands then break when run under the other version. Specifically, I had an issue with node-sass and its Windows bindings. The solution was to point Visual Studio to the version of Node.JS that I had already set up outside of Visual Studio. Here's how to synchronize them:

  1. First, find the Node.js installation you already have and use at the command line (the commands just after this list show one way to locate it). By default, Node.js 0.12.7 installs to “C:\Program Files\nodejs”.
  2. Copy that path to your clipboard, then go to Tools > Options in Visual Studio 2015.
  3. In this dialog, go to Projects and Solutions > External Web Tools to open the dialog that manages all of the third-party tools used within VS. This is where Node.js is pointed to.
  4. Add an entry at the top with the path to your external Node.js directory to force Visual Studio to use that version instead.
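
If you are not sure where that external install lives, the following commands at a regular command prompt show the path and version currently in use (assuming Node.js is on your PATH):

where node
node -v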


And that's it!  Now you're all synced up!  Having two separate installs is really confusing.  If you're starting out with JUST the VS Node.js version, you'll eventually reach a point where you update Node.js by installing it outside VS, causing the two to get out of sync anyway.  If you're a veteran Node.js person, then you're already using Node outside VS and will need to do this anyway.  It would help if VS had better documentation or an indicator showing which Node.js version it is using, so this was more apparent.

Hope that helped.  Did this fix it for you?  Do you have a better way of keeping this in sync or a plugin/tool to help out?  Let us know in the comments!

Best OFFER Cheap Node.JS Hosting! Click Here


By Alexia Pamelov

Cheap Windows ASP.NET Hosting Tutorial – FluentFilters for ASP.NET Core

CheapWindowsHosting.com | Best and cheap ASP.NET hosting. In this post we will look at a library for ASP.NET Core that adds support for criteria on global action filters. ASP.NET Core has the ability to register filters globally. That works great, but sometimes it would be nice to specify conditions for filter execution, and FluentFilters will help with this task.


Project on GitHub: https://github.com/banguit/fluentfilters

What should you do?

First, install the package

For an ASP.NET Core Web Application you should use FluentFilters version 0.3.* or higher; the latest version is currently 0.3.0-beta.
To install the latest package you can use the NuGet Package Manager in Visual Studio, or specify the dependency in the project.json file as shown below and then restore packages.

{
    //...
    "dependencies": {
        //...
        "fluentfilters": "0.3.0-beta"
    },
    //...
}
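
If you edit project.json by hand, you can also run the restore step from a command prompt in the project directory (assuming the .NET Core CLI tooling is installed):

dotnet restore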

Configuration:

After installing the package in your ASP.NET Core Web Application, you should replace the default filter provider with the custom one from the library. Your Startup class should look like the one shown below:

// Startup.cs
using FluentFilters;  
using FluentFilters.Criteria;

namespace DotNetCoreWebApp  
{
  public class Startup
  {
    //...
    public void ConfigureServices(IServiceCollection services)
    {
      //...
      services.AddMvc(option =>
      {
        option.Filters.Add(new AddHeaderAttribute("Hello", "World"), c =>
        {
          // Example of using predefined FluentFilters criteria
          c.Require(new ActionFilterCriteria("About"))
            .Or(new ControllerFilterCriteria("Account"))
            .And(new ActionFilterCriteria("Login"));
        });
      });

      // Replace default filter provider by custom from FluentFilters library
      Microsoft.Extensions.DependencyInjection.Extensions.ServiceCollectionExtensions.Replace(services, ServiceDescriptor.Singleton<IFilterProvider, FluentFilterFilterProvider>());
      //...
    }
    //...
  }
}

Registering filters

To register filters with criteria, you register them in the usual way but call the extension methods Add or AddService. Below you can see the signatures of these methods.

// Register filter by instance
void Add(this FilterCollection collection, IFilterMetadata filter, Action<IFilterCriteriaBuilder> criteria);

// Register filter by type
IFilterMetadata Add(this FilterCollection collection, Type filterType, Action<IFilterCriteriaBuilder> criteria);
IFilterMetadata Add(this FilterCollection collection, Type filterType, int order, Action<IFilterCriteriaBuilder> criteria);
IFilterMetadata AddService(this FilterCollection collection, Type filterType, Action<IFilterCriteriaBuilder> criteria);
IFilterMetadata AddService(this FilterCollection collection, Type filterType, int order, Action<IFilterCriteriaBuilder> criteria);

Specify conditions

To specify the conditions, you should set the chain of criteria for the filter at registration. Using criteria, you can set whether to execute a filter or not. The library already provides three criteria for use:

  • ActionFilterCriteria – filter by specified action
  • AreaFilterCriteria – filter by specified area
  • ControllerFilterCriteria – filter by specified controller

For each filter you can specify two chains of criteria: the criteria that are required, and the criteria under which the filter should be excluded.

option.Filters.Add(typeof(CheckAuthenticationAttribute), c =>  
{
    // Execute if current area "Blog"
    c.Require(new AreaFilterCriteria("Blog"));
    // But ignore if current controller "Account"
    c.Exclude(new ControllerFilterCriteria("Account"));
});

Chains of criteria are constructed by using the methods And(IFilterCriteria criteria) and Or(IFilterCriteria criteria), which work as conditional logical operators && and ||.

option.Filters.Add(typeof(DisplayTopBannerFilterAttribute), c =>  
{
    c.Require(new IsFreeAccountFilterCriteria())
        .Or(new AreaFilterCriteria("Blog"))
        .Or(new AreaFilterCriteria("Forum"))
            .And(new IsMemberFilterCriteria());

    c.Exclude(new AreaFilterCriteria("Administrator"))
        .Or(new ControllerFilterCriteria("Account"))
            .And(new ActionFilterCriteria("LogOn"));
});

In C#-like pseudocode, the registration above can be understood as:

if( IsFreeAccountFilterCriteria() || area == "Blog" ||  
    (area == "Forum" && IsMemberFilterCriteria()) ) 
{
    if(area != "Administrator")
    {
        DisplayTopBannerFilter();
    }
    else if(controller != "Account" && action != "LogOn")
    {
        DisplayTopBannerFilter();
    }
}

Implementation of custom criteria

To create a custom criterion, implement the FluentFilters.IFilterCriteria interface in your class; it has only one method, Match, which contains the logic for deciding whether the filter should execute. As an example, look at the source code of ActionFilterCriteria:

public class ActionFilterCriteria : IFilterCriteria  
{
    #region Fields

    private readonly string _actionName;

    #endregion

    #region Constructor

    /// <summary>
    /// Filter by specified action
    /// </summary>
    /// <param name="actionName">Name of the action</param>
    public ActionFilterCriteria(string actionName)
    {
        _actionName = actionName;
    }

    #endregion

    #region Implementation of IActionFilterCriteria

    public bool Match(FilterProviderContext context)
    {
        return string.Equals(_actionName, context.ActionContext.RouteData.GetRequiredString("action"), StringComparison.OrdinalIgnoreCase);
    }

    #endregion
}
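
As a further illustration, a minimal hypothetical custom criterion that only matches HTTP GET requests might look roughly like this (it assumes the current request is reachable via context.ActionContext.HttpContext, as it is in ASP.NET Core, and uses the same namespaces as the snippet above):

public class HttpGetFilterCriteria : IFilterCriteria
{
    public bool Match(FilterProviderContext context)
    {
        // Execute the filter only for HTTP GET requests
        return string.Equals(context.ActionContext.HttpContext.Request.Method,
                             "GET", StringComparison.OrdinalIgnoreCase);
    }
}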

 

By Alexia Pamelov

Cheap Windows Hosting – How To Enable ASP.NET 4.5 On Windows 10 or Windows 8/8.1

CheapWindowsHosting.com | Best and Cheap Windows hosting. By default IIS and ASP.NET aren't configured as part of a Windows setup (for obvious reasons), so developers are used to having to register ASP.NET with IIS manually (classically with aspnet_regiis.exe) before being able to run and develop ASP.NET web sites on their desktops.


On Windows 10 and Windows 8/8.1 that approach no longer works and requires a different command. Depending on what you already have enabled, this may be enough:

dism /online /enable-feature /featurename:IIS-ASPNET45

If you haven’t enabled anything related to IIS yet you can do that at the same time with:

dism /online /enable-feature /all /featurename:IIS-ASPNET45
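
If you are not sure whether the feature is already enabled, you can query its state first:

dism /online /get-featureinfo /featurename:IIS-ASPNET45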

However, that might not appear to solve the problem even when it has!  A post from Microsoft describes the bug:

After the installation of the Microsoft .NET Framework 4.6, users may experience the following dialog box displayed in Microsoft Visual Studio when either creating new Web Site or Windows Azure project or when opening existing projects.

Configuring Web http://localhost:64886/ for ASP.NET 4.5 failed. You must manually configure this site for ASP.NET 4.5 in order for the site to run correctly. ASP.NET 4.0 has not been registered on the Web server. You need to manually configure your Web server for ASP.NET 4.0 in order for your site to run correctly.

NOTE: Microsoft .NET Framework 4.6 may also be referred to as Microsoft .NET Framework 4.5.3

This issue may impact the following Microsoft Visual Studio versions: Visual Studio 2013, Visual Studio 2012, Visual Studio 2010 SP1

Workaround:

Select “OK” when the dialog is presented. This dialog box is benign and there will be no impact to the project once the dialog box is cleared. This dialog will continue to be displayed when Web Site Project or Windows Azure Projects are created or opened until the fix has been installed on the machine.

Resolution:

Microsoft has published a fix for all impacted versions of Microsoft Visual Studio.

Visual Studio 2013

  • Download Visual Studio 2013 Update 4
  • For more information on the Visual Studio 2013 Update 4, please refer to: Visual Studio 2013 Update 4 KB Article

Visual Studio 2012

  • An update to address this issue for Microsoft Visual Studio 2012 has been published: KB3002339
  • The update can also be installed directly from the Microsoft Download Center

Visual Studio 2010 SP1

  • An update to address this issue for Microsoft Visual Studio 2010 SP1 has been published: KB3002340
  • This update is available from Windows Update

By Alexia Pamelov

Cheap Cloud Hosting Server

CheapWindowsHosting.com | Best and cheap cloud hosting server plan. In some respects cloud servers work in the same way as physical servers but the functions they provide can be very different. When opting for cloud hosting, clients are renting virtual server space rather than renting or purchasing physical servers. They are often paid for by the hour depending on the capacity required at any particular time.


Traditionally there are two main options for hosting: shared hosting and dedicated hosting. Shared hosting is the cheaper option whereby servers are shared between the hosting provider’s clients. One client’s website will be hosted on the same server as websites belonging to other clients. This has several disadvantages including the fact that the setup is inflexible and cannot cope with a large amount of traffic. Dedicated hosting is a much more advanced form of hosting, whereby clients purchase whole physical servers. This means that the entire server is dedicated to them with no other clients sharing it. In some instances the client may utilise multiple servers which are all dedicated to their use. Dedicated servers allow for full control over hosting. The downside is that the required capacity needs to be predicted, with enough resource and processing power to cope with expected traffic levels. If this is underestimated then it can lead to a lack of necessary resource during busy periods, while overestimating it will mean paying for unnecessary capacity.

Below are the key benefits of cloud servers:

  • Flexibility and scalability; extra resource can be accessed as and when required
  • Cost-effectiveness; whilst being available when needed, clients only pay for what they are using at a particular time
  • Ease of set up; Cloud servers do not require much initial setup
  • Reliability; due to the number of available servers, if there are problems with some, the resource will be shifted so that clients are unaffected. 

What’s Cloud Hosting?

Cloud hosting services provide hosting for websites on virtual servers which pull their computing resource from extensive underlying networks of physical web servers. It follows the utility model of computing in that it is available as a service rather than a product and is therefore comparable with traditional utilities such as electricity and gas. Broadly speaking the client can tap into their service as much as they need, depending on the demands of their website, and they will only pay for what they use.

It exists as an alternative to hosting websites on single servers (either dedicated or shared servers) and can be considered as an extension of the concept of clustered hosting where websites are hosted on multiple servers. With cloud hosting however, the network of servers that are used is vast and often pulled from different data centres in different locations. 

Practical examples of cloud hosting can fall under both the Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) classifications. Under IaaS offerings the client is simply provided with the virtualised hardware resource on which they can install their own choice of software environment before building their web application. On a PaaS service, however, the client is also provided with this software environment, for example, as a solution stack (operating system, database support, web server software, and programming support), on which they can go straight to installing and developing their web application. Businesses with complex IT infrastructures and experienced IT professionals may wish to opt for the more customisable IaaS model but others may prefer the ease of a PaaS option.

A development of the concept of cloud hosting for enterprise customers is the Virtual Data Centre (VDC). This employs a virtualised network of servers in the cloud which can be used to host all of a business’s IT operations including its websites.

The more obvious examples of cloud hosting involve the use of public cloud models – that is hosting on virtual servers which pull resource from the same pool as other publicly available virtual servers and use the same public networks to transmit the data; data which is physically stored on the underlying shared servers which form the cloud resource. These public clouds will include some security measures to ensure that data is kept private and would suffice for most website installations. However, where security and privacy is more of a concern, businesses can turn towards cloud hosting in private clouds as an alternative – that is clouds which use ring-fenced resources (servers, networks etc), whether located on site or with the cloud provider.

A typical cloud hosting offering can deliver the following features and benefits:

  • Reliability; rather than being hosted on one single instance of a physical server the website is hosted on a virtual partition which draws its resources, such as disk space, from an extensive network of underlying physical servers. If one server goes offline, it dilutes the level of resource available to the cloud a little but will have no effect on the availability of the website whose virtual server will continue to pull resource from the remaining network of servers. Some cloud platforms could even survive an entire data centre going offline as the pooled cloud resource is drawn from multiple data centres in different locations to spread the risk.
  • Physical Security; the underlying physical servers are still housed within data centers and so benefit from the security measures that those facilities implement to prevent people accessing or disrupting them on-site
  • Scalability and Flexibility; resource is available in real time on demand and not limited to the physical constraints/capacity of one server. If a client’s site demands extra resource from its hosting platform due to a spike in visitor traffic or the implementation of new functionality, the resource is accessed seamlessly. Even when using a private cloud model the service can often be allowed to ‘burst’ to access resources from the public cloud for non-sensitive processing if there are surges in activity on the site.
  • Utility style costing; the client only pays for what they actually use. The resource is available for spikes in demand but there is no wasted capacity remaining unused when demand is lower.
  • Responsive load balancing; load balancing is software based and therefore can be instantly scalable to respond to changing demands

Cheap Cloud Hosting Server Recommendation


HostForLIFE.eu is one of the most famous and best cloud hosting providers out there to start a blog without spending a single extra penny. They have more than 2 million domains hosted on their servers. HostForLIFE.eu offers unlimited space and unlimited bandwidth. HostForLIFE claims 99.99% uptime with 24/7 technical support and a 30-day money-back guarantee.

By Alexia Pamelov

Joomla Hosting with Docker and Memcached

CheapWindowsHosting.com | In this post I'll show you how you can quickly and easily set up a fast Joomla! site running in Docker, using Memcached for caching. We'll also be using Docker Compose to make managing the relationships between containers easier. Docker Compose makes the task of managing multiple Docker containers much easier than handling them individually. It also makes it easier if you want to develop this site on your local environment and later push it to a remote server with Docker, although that's a topic for another tutorial.


Install Docker and Docker Compose

If you haven't installed Docker on your machine yet, the instructions for Ubuntu 14.04 (LTS) are below; be sure to run them as root! If you already have Docker, skip ahead to installing Docker Compose if you don't already have that too.

Install Docker on Ubuntu 14.04 LTS
apt-get update
apt-get install apt-transport-https ca-certificates
apt-key adv --keyserver hkp://p80.pool.sks-keyservers.net:80 --recv-keys 58118E89F3A912897C070ADBF76221572C52609D
echo "deb https://apt.dockerproject.org/repo ubuntu-trusty main" >> /etc/apt/sources.list.d/docker.list
apt-get update
apt-get install linux-image-extra-$(uname -r) apparmor docker-engine
service docker start
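
To confirm the Docker engine is up and running before moving on, you can print its version and run the tiny hello-world test image (pulled from Docker Hub):

docker --version
docker run hello-world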

Next, install Docker Compose:

curl -L https://github.com/docker/compose/releases/download/1.6.2/docker-compose-`uname -s`-`uname -m` -o /usr/local/bin/docker-compose
chmod +x /usr/local/bin/docker-compose
docker-compose -v

You should see docker-compose output its version information. Something like: docker-compose version 1.6.2, build 4d72027.

Prepare Joomla and Docker files

Next up we need to create the docker compose files and get the Joomla files ready to use:

Create Project Directory

mkdir -p ./mysite/website_source ./mysite/mysql
cd mysite

Create a Dockerfile

Next create a Dockerfile (e.g. vim Dockerfile) that tells Docker to base your web server image on a PHP 5 image and include your source code.

Dockerfile
FROM php:5-fpm
 
# Install dependencies
RUN apt-get update -y && apt-get install -y php5-json php5-xmlrpc libxml2-dev php5-common zlib1g-dev libmemcached-dev
 
# Install mysqli extension
RUN docker-php-ext-install mysqli
 
# Install memcache extension
RUN pecl install memcache && docker-php-ext-enable memcache
 
# Add sources to image
ADD ./website_source /site

Create the Docker Compose config file

And then create the docker-compose.yml file (e.g. vim docker-compose.yml). The docker-compose file lays out how your website is structured. It says that your site is composed of three services: web, which is a PHP image with your source code baked in; db, which is the MySQL database; and cache, which is the memcached image. It also tells Docker how to connect these containers so that they can communicate with each other, and binds port 80 on the host to the web container. Finally, the mysql container mounts the mysql directory from the host and places the database files there, so if the container is ever removed you don't lose your database.

docker-compose.yml
version: '2'
services:
  web:
    build: .
    command: php -S 0.0.0.0:80 -t /site
    ports:
      - "80:80"
    depends_on:
      - db
      - cache
    volumes:
      - ./website_source:/site
  db:
    image: orchardup/mysql
    environment:
      MYSQL_DATABASE: joomla
      MYSQL_ROOT_PASSWORD: my_secret_pass
    volumes:
      - ./mysql:/var/lib/mysql
  cache:
    image: memcached

 

Be sure to replace my_secret_pass with a secure password for the MySQL user!

Get Joomla! Sources

Now that you have a Dockerfile and docker-compose.yml you just need to get the sources for Joomla and install them:

Download and Install Joomla!

wget https://github.com/joomla/joomla-cms/releases/download/3.4.8/Joomla_3.4.8-Stable-Full_Package.zip
unzip Joomla*.zip -d ./website_source
mv ./website_source/htaccess.txt ./website_source/.htaccess
mv ./website_source/robots.txt.dist ./website_source/robots.txt

Note that if you don’t have unzip installed you can install it by running apt-get install unzip.

Build and Run Docker Containers

Now that you have everything set up, it's time to test it by building and running the Docker containers. This is accomplished with docker-compose:

docker-compose build
docker-compose up

This will run the whole application in the foreground of your terminal. Go to http://localhost:80 and complete the Joomla installer! You'll use the MySQL username and password you specified in your docker-compose.yml file. The MySQL host is also specified in the docker-compose.yml file as the name of the database service; in our case, this is db. Once you're finished you can use CTRL+C to stop the containers.
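
For reference, with the docker-compose.yml above the database step of the Joomla installer would typically be filled in like this (the root username is implied by the MYSQL_ROOT_PASSWORD setting of the MySQL image):

Host Name:     db
Username:      root
Password:      my_secret_pass
Database Name: joomla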

Configuring Joomla for Memcached

Now that your Joomla site is running under docker it’s time to connect it to the memcached server to make sure that things stay speedy!

To enable memcached, edit website_source/configuration.php and replace

public $caching = '0';
public $cache_handler = 'file';

with this:

public $caching = '2';
public $cache_handler = 'memcache';
public $memcache_server_host = 'cache';
public $memcache_server_port = '11211';

Add your changes to the container image with docker-compose build and then run docker-compose up again. Log into the Joomla administration page and go to “Global Configuration” -> “System”. You can tweak the settings under “Cache Settings” or leave them as they are.

Running on Server Start

The last step in setting up a web application with docker is to have the web server started when the server starts.

Create the file /etc/init/mydockersite.conf with the contents:

/etc/init/mydockersite.conf
  
description "Website Docker Compose"
author "MichaelBlouin"
start on filesystem and started docker
stop on runlevel [!2345]
respawn
script
  /usr/local/bin/docker-compose -f /var/www/mysite/docker-compose.yml up
end script

Be sure to replace /var/www/mysite/docker-compose.yml with the full path to your docker-compose.yml!

Save the file and run the following to register the service, and to start it:

initctl reload-configuration
service mydockersite start

And there you go! You can view logs for your service by running docker-compose logs while in the same directory as your docker-compose.yml, or by reading the logfile at /var/log/upstart/mydockersite.log.

By Alexia Pamelov

How to Fix ASP.NET Core cannot find runtime for Framework ‘.NETCoreApp’

CheapWindowsHosting.com | Best and cheap ASP.NET Core hosting. Today I upgraded my ASP.NET Core application from version 1.0 to version 1.0.1 because of a bug in Entity Framework. Right after updating to the latest ASP.NET Core version, I built the project but ended up with the following error in Visual Studio:


Can not find runtime target for framework '.NETFramework,Version=v4.5.1' compatible with one of the target runtimes: 'win10-x64, win81-x64, win8-x64, win7-x64'. Possible causes:
The project has not been restored or restore failed - run dotnet restore
You may be trying to publish a library, which is not supported. Use dotnet pack to distribute libraries

Fix – Update project.json

After searching around for a few minutes I found issue #2442 on GitHub. The issue states that you need to update your project.json, and you have two options:

(1). Include the platforms you want to build for explicitly:

"runtimes": {
    "win10-x64": {},
    "win8-x64": {} 
},

(2). Update the reference Microsoft.NETCore.App to include the type as platform:

"Microsoft.NETCore.App": {
    "version": "1.0.1",
    "type": "platform"
}
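
For context, here is a rough sketch of where that reference usually sits in a project.json for a netcoreapp1.0 application (your file will contain more entries than this); after editing it, run dotnet restore as the error message suggests:

{
    "dependencies": {
        "Microsoft.NETCore.App": {
            "version": "1.0.1",
            "type": "platform"
        }
    },
    "frameworks": {
        "netcoreapp1.0": {}
    }
}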

Conclusion

For more information on .NET Core application deployment you can read the docs. This is again another reason I love that all this work is being done out in the open. It really makes finding issues and bugs of this type easier. Hope that helps!

Best OFFER Cheap ASP.NET 5 Hosting! Click Here

By Alexia Pamelov

Free SSL now available with Let’s Encrypt Hosting

CheapWindowsHosting.com | Best and cheap Let's Encrypt hosting. An SSL certificate provides an encrypted connection between the server and the visitor's browser, and is essential for an e-commerce site (such as one powered by WordPress + WooCommerce) in order to protect sensitive customer data and backend administration.

If you don't have an e-commerce website, it can be difficult to justify the cost of an SSL certificate.  However, as Google now rewards sites with an SSL with higher rankings, website owners are having to weigh up the cost against the potential benefits.


Why should you use an SSL?

Still unsure if you should use SSL, even if it’s free? Here are three reasons to install a Let’s Encrypt SSL certificate:

  • For security. SSL ensures that visitors' information is encrypted and therefore kept private.  With more and more people becoming aware of internet security, visitors feel safer on websites that have a security certificate and show the lock icon in the address bar.
  • For SEO. Google favours sites with SSL and may reward them with a higher ranking.
  • For speed. SSL over HTTP/2 (supported by Create Hosting servers) provides improved load times.  This is good for visitors and search engines.

Who is Let’s Encrypt?

Let's Encrypt is a free, automated, and open certificate authority based on the principles of co-operation, transparency and public benefit.  Let's Encrypt is sponsored by a number of very well-known companies, for example Google Chrome, Facebook and Automattic (the company behind WordPress).

How do I activate my Let’s Encrypt SSL?

Let’s Encrypt is provided with the WP Plus plan for all existing and future Create Hosting customers. Currently it can only be setup by an administrator. We can enable this during setup, or at any time – just submit a support ticket and we’ll take care of it for you.

Good news: Plesk and Let’s Encrypt now make it possible for WordPress hosting customers on our WP Plus plan to use an SSL on their website completely free of charge. The Let’s Encrypt SSL needs to be renewed every 90 days, but Plesk takes care of this, automating the SSL re-issuance process every 30 days behind the scenes.

By Alexia Pamelov

How To Integrate Plesk Onyx with Git

CheapWindowsHosting.com | Best and cheap Git hosting. In this post I will explain how to integrate Plesk Onyx with Git.

Plesk allows you to integrate with Git – the most popular source code management system used by many web developers. You can manage Git repositories and automatically deploy web sites from such repositories to a target public directory. In other words, you can use Git as a transport for initial publishing and further updates.

Note: This functionality is not supported in Plesk installations running on Windows Server 2008.

To work with Git, you need the following:

  • The Git extension installed in Plesk (for details, refer to the Deployment guide).
  • A domain should be created in Plesk with a service plan granting the Git management permission.

In Plesk, you can add Git repositories of two types depending on the usage scenario:

  • Using a local repository on your workstation. In this case, you send the changes from your local repository to Plesk, and then Plesk deploys the changes to your web site (see the example commands just after this list). Refer to Using local repository.
  • Using remote Git hosting. This scenario may be useful if you already work with some remote repository on GitHub (github.com) or BitBucket (bitbucket.org). You send the changes to this remote repository, and then Plesk pulls them from the remote repository and deploys them to your web site. Refer to Using remote Git hosting.
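
As a rough sketch of the local-repository workflow, the commands below push a site to Plesk from a workstation. The remote URL is only illustrative; use the one Plesk displays when you create the repository:

git init
git add .
git commit -m "Initial site import"
git remote add plesk <URL shown by Plesk for the repository>
git push plesk master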

When you have Git repositories enabled in your domain, the list of created repositories is displayed on the domain’s page. For each repository, the name, the current branch and the deployment path are displayed. The Deploy button near the repository name allows you to deploy the files from a repository (if manual deployment is configured) and the Pull Updates button allows you to pull the changes from the remote repository.

The Git link allows you to manage the domain’s Git repositories.


Best and Recommended Git Hosting


Founded in 2008, ASPHostPortal is a fast-growing web hosting company operated in New York, NY, US, offering comprehensive Git hosting solutions, and they have a brilliant reputation in the Git development community for their budget- and developer-friendly hosting which supports almost all the latest cutting-edge Microsoft technology. ASPHostPortal has various shared hosting plans, ranging from Host Intro to Host Seven, but there are only 4 favorite plans: Host One, Host Two, Host Three, and Host Four. The Host One plan starts at $5.00/month, Host Two at $9.00/month, Host Three (the most popular plan) at $14.00/month, and Host Four at $23.00/month. All of their hosting plans allow users to host unlimited domains and unlimited email accounts, with at least 1 MSSQL and 1 MySQL database. ASPHostPortal is the best Git hosting; check further information at http://www.asphostportal.com


By Alexia Pamelov

How to Setting Up Magento for the Search Engines

CheapWindowsHosting.com | Best and cheap Magento hosting. In this post I will explain more about Magento. As a digital agency, we work with Magento every day on both development and search engine optimisation projects. If you've used Magento in the past, you know it's a huge system with lots of menus, drop-down options and settings all over the place.

Optimising Magento for search is quite straightforward once you know how to do it, so I put together an easy-to-follow guide that everyone can use to make the process easier to learn. This guide is based around the Magento Community Edition.

For this tutorial, I’m going to assume you have a basic knowledge of SEO, but I’ll also point out selections along the way if you want to read more about specific aspects of SEO that I refer to. 

I’m not going to cover general page layout, heading tags or the actual content you should write. This is a basic article to get the core configuration of Magento correctly setup, and to help people out with some of the most common questions we’re asked with regards to SEO and Magento.

So, log in to your Magento store’s admin panel and let’s get stuck in.

Magento Store Configuration

Most stores that are live will have already carried out a few of these steps. That’s okay, since we are covering the basics, and want to make sure we cover all the bases. Let’s start from the top of the System > Configuration page and work our way down.

Go to System > Configuration > Design > HTML Head. In here, you'll see the basic fallback settings that Magento has that you can use for SEO purposes. If you haven't already set up a Favicon, then do that first. It doesn't affect your SEO, but the standard Magento one doesn't look great.

The defaults you want to ensure are set here are Default Title, Default Description, and Default Robots.

Recommended: We usually fill in the Title Suffix as well with our clients’ brand name. For example, we might put – Pinpoint Designs into the Title Suffix field. This will then be appended to each title tag.

Since the above options are only fallbacks, I would normally recommend putting your company name in as the Default Title, and using a description of your company for the Default Description. It’s very important that your Default Robots is set to INDEX, FOLLOW if your store is live. For a development store, you should switch this to NOINDEX, NOFOLLOW. (Remember to swap it back when you go live, or search engines may choose to ignore your website.)


Note: While Meta Keywords are not used by many search engines anymore, Magento will fall back to your product names if these aren't set. For Default Keywords, you can enter your store name as the fallback.

If you’re looking for advice on Meta Titles and Meta Descriptions, take a look at the Moz guides that I’ve linked to here.

Moving on, one of the easiest changes you can make to Magento is to prevent the index.php string from appearing in your main URL. At the same time you change this, you can also force Magento to the www. or non-www. version of your website to avoid duplicates.

To carry out these changes, go to System > Configuration > Web. In here, you’ll see a list of different sections that you can open. We want to open both the URL Options and Search Engine Optimisation sections.

Now set Auto-Redirect to Base URL to Yes (301 Moved Permanently) to automatically get Magento to redirect to your base URL. (So if your base url is http://www.yourdomain.com, it will redirect to the www. version of your website from now on.)

Next, set Use Web Server Rewrites to Yes in order to remove the index.php string from your base URL.


Note: The above changes may not work depending on your server configuration. If in doubt, contact your web hosting provider for assistance.

In order to get the search engines to only recognise one version, we should enable canonical URLs. To do this, go to System > Configuration > Catalog and choose the Search Engine Optimizations dropdown option. There are quite a few options that we can set in here. I’ll explain them very quickly:

  • Autogenerated Site Map – If this is set to enabled, Magento will generate two pages on your site that display links to your products and categories. I would recommend having this option set to Yes.
  • Popular Search Terms – If enabled, this will allow pages to display your most popular search phrases. This setting should be used to target your users, rather than used for SEO purposes. Set to Yes.
  • Product URL Suffix – This is the suffix that is added to the end of your product URLs. Leave the setting as .html.
  • Category URL Suffix – This is the suffix that is added to the end of your category URLs. Leave the setting as .html.
  • Use Categories Path for Product URLs – If enabled, Magento will include the category URL in your URL string. For example, URLs would look like this: yourdomain.com/category-name/product-page.html. I would recommend setting this to No, as leaving it enabled could have adverse effects when used in conjunction with canonical URLs (especially on larger stores).
  • Create Permanent Redirect for URLs if URL Key Changed – It's recommended to set this to Yes. This will automatically create a redirect via the URL Rewrites module in Magento if the URL key is changed on any page on your website.
  • Page Title Separator – This is the character that separates the page titles on the front-end of your store. This could be a vertical pipe if you prefer, but I would generally recommend leaving this as a hyphen.
  • Use Canonical Link Meta Tag For Categories – If enabled, a tag will be added to the HTML code on categories displaying the main version of the category page. This is then picked up by search engines to avoid duplicate content. Set this option to Yes.
  • Use Canonical Link Meta Tag For Products – (Same as above for Product Pages.) If you have the categories option above set, then it’s not as important to have this set to Yes, since only one version of a product page will appear. However, to ease your mind, I recommend setting this option to Yes.


Once you’ve updated these settings, it’s important to reindex the data on your website. To do this, go to System > Index Management. Click Select All and then Reindex Data using the mass action drop down in the top right hand corner of the page.
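
If you have SSH access, Magento Community Edition also ships with a command-line reindex script, which can be handy on larger catalogues (the path below is illustrative):

cd /path/to/your/magento/root
php -f shell/indexer.php -- reindexall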


XML Sitemap Generation

The easiest way for a search engine to crawl your website is via a sitemap submitted to Google Webmaster Tools, Bing Webmaster tools, Yahoo Site Explorer, etc. As you would expect, Magento will keep your sitemap up to date and generate this for you automatically. In order to enable this, go to System > Configuration > Google Sitemap (under the Catalog heading).

In here, we can configure the priority of each of our pages, along with how often they’re updated and how often we want the sitemap to be updated. This section is a little hard to explain in a tutorial, as it completely depends on your type of store and what you’re primarily optimising.

For the purpose of this article, we’re going to assume your category pages are the most important pages, as these house all of your products and should be optimised for more general terms. We’d next prioritise product pages, as these are specific pages that you want people to hit if they’re looking for a particular item. Finally, we’d have our CMS pages. These are pages that cover information such as terms and conditions, your privacy policy, and shipping information, so they’re generally lower priority. Your homepage also comes under the CMS pages heading.

So, using the above as an example, we’d select the priority and frequency as follows:

Category Options: Frequency set to Daily; Priority set to 1.

Product Options: Frequency set to Daily; Priority set to 0.5.

CMS Page Options: Frequency set to Weekly; Priority set to 0.25.

With the above, if your product catalog and categories don’t change very often, you could drop the frequency down to weekly, but this isn’t necessary.

Note: For the Generation Settings to work, you will need to make sure your Magento cron works correctly.
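
If the Magento cron is not yet configured, a typical crontab entry runs the bundled cron.sh every five minutes (the path is illustrative; check the details with your hosting provider):

*/5 * * * * /bin/sh /path/to/your/magento/root/cron.sh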

Next, we need to generate the actual sitemap file. To do this, go to Catalog > Google Sitemap and click on the Add Sitemap button in the top right. Then give your sitemap a name, and put a forward slash in the path field to get it to save in the root directory.

Once done, click Save & Generate and your sitemap should be viewable at yourdomain.com/sitemap.xml.

Assuming it all worked correctly, head over to Google, Bing and Yahoo and submit the sitemap URL you’ve just generated. We’ll add it to the Robots.txt file later.

Additional Notes: If you’re running multiple stores from the same Magento installation, you might want to separate your sitemaps. So using the example of an English and Spanish store, you might call one sitemap-en.xml and the other sitemap-es.xml. You might also want to put these into a subdirectory. If you do this, you will need to make sure that the folder has CHMOD permissions to write. CHMOD 755 should be fine, but you may need to change this to 775 on certain setups. Never set your CHMOD permissions to 777. If in doubt, ask your hosting provider.

Robots.txt

I’m not going to go into huge detail on the Robots.txt file as there’s a fantastic guide written by Inchoo with example templates and different versions explained. Take a look at it and make a judgement call on which Robots.txt file will do the best job for you. You can then modify it to suit your store’s particular requirements.

Remember to update the sitemap URL with the one we just generated (above). This will allow other search engines to pick up your sitemap without the need to submit to them all.

Of the templates in the above guide, I would strongly recommend using the Inchoo Robots.txt file. That said, it's important to check everything over before you add it to your store.
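
As a minimal illustration of the part you need to edit (the Inchoo template is far more complete), the sitemap reference in robots.txt looks like this:

User-agent: *
Sitemap: http://yourdomain.com/sitemap.xml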

Google Analytics

Adding your Google Analytics tracking code to Magento is very straightforward. Head over to http://analytics.google.com and log into your account. Make sure that you have eCommerce tracking turned on. (This can be done by going to Admin and clicking on the Ecommerce Settings option which appears under the View heading on the right.)

Once you've done this, head over to System > Configuration > Google API to enable the module and enter your UA- tracking number. Click Save and you're done.

Alternative Solution – I would recommend installing the Fooman Google Analytics + module, which is free from the Magento extensions store. This allows you to track AdWords conversions, secondary profiles, dynamic remarketing and more within Magento. If you’re unsure of how to install modules, ask your web developers, or follow this guide. Once installed, go to System > Configuration > Google API and open up the option for GoogleAnalyticsPlus by Fooman. Fooman offers a full guide on how to set this module up, and it’s much better than the standard Magento tracking.

Page Optimisation

Finally, let’s take a look at page optimisation. This is a fairly simple section of Magento where it’s really down to you to come up with some brilliant content and make sure your pages are optimised properly for the search engines. We’ll split this into three sections: CMS Pages, Category Pages, and Product Pages.

  • CMS Pages – CMS pages in Magento are content pages. You generally use them for adding information pages to your site. They can be very powerful and pull in related products, etc., using widgets. As with all pages, it’s important that you optimise them correctly. The key things to look out for are as follows:
    • Page Title – On CMS pages, Page Titles are usually quite straightforward, such as Terms and Conditions or Shipping Information. You can also use these for information pages that drive traffic to your store.
    • URL Key – This is the URL that your page sits on. If you are a company selling plumbing equipment for houses, you might create a piece of content that sits on a CMS page called Radiator Size Guide. The URL Key might then be radiator-size-guide, which would make the URL yourdomain.com/radiator-size-guide/.
    • Content – This is your key area. The phrase “content is king” may be used all the time, but it’s definitely a good cliché to bear in mind. Write good content for your pages, structure it with different heading tags, make it interesting… and the page will be fine. Add images, get your keywords in there, and make it look great.
    • Meta Data (Keywords and Description) – Meta Keywords are not used much in SEO anymore. Most people use them to gather information from competitors to find out what keywords they’re targeting. The main section to fill out here is the Meta Description. Make sure you keep your Meta Description short (150 – 160 characters) and relevant to the page content.
  • Category Pages – Categories are likely the most powerful pages on your store for driving traffic (potentially after your homepage). For this reason, it’s important that you fill them out in full with as much information as possible. The main sections to consider are as follows:
    • Name – This is your category name.
    • Description – Make sure you fill out a full, relevant description of your category. Include keywords that the search engines will pick up, and make sure it's ultimately beneficial to the user. There's nothing worse than visiting a website that says "we sell a range of green slippers and blue slippers and yellow slippers and orange slippers and purple slippers for christmas, birthdays, weddings, anniversaries and other slipper related occasions."  That said, think of your users first… but try to include your main target keywords in there, too.
    • Page Title – This is your meta title. Keep it fairly short. And remember, if you've got your default Title Suffix set in Magento, your brand name will automatically be added to the end. Keep it relevant, too, and get your keywords in towards the beginning of the tag.
    • Meta Keywords – As mentioned in the CMS Pages section, these aren’t really required anymore. Fill them in separated by commas if you want to be really keen.
    • Meta Description – This is very important, so make sure it’s completely relevant to your category, covers the products that you’re selling and reinforces your brand. Your Meta Description shows in the search engines and should be used to encourage users to click through to your site. Don’t forget to get your target key phrase in there!
    • Side Note: If you’ve got a category that’s got lots of filters on it, make sure that Is Anchor is set to Yes in Display settings. This will ensure that layered navigation is enabled.
  • Product Pages – Finally, we’ve got Product Pages. These are the key pages that you want to drive traffic to. Try to fill as much information in on these pages as possible. If you’re using Google Merchant Centre to promote your products, you want to make sure you’ve got your product attributes correctly configured to pass as much information back to Google as possible. If you’re just looking to optimise the pages for search, then the following sections are the main areas to look at:
    • Name – Same as above, this is your product name. Try to make it descriptive. Think about what people might search for.
    • Description – This is your full product description. Try to go into as much detail as possible, making your content completely unique, relevant to the product and helpful for the users. If you don’t have HTML experience, use the inbuilt WYSIWYG editor to format the descriptions to look smart. Make sure they’re easy to read, too.
    • Short Description – This is dependent on your product theme. Usually, this is the description that pulls through onto your Category Page. Make sure this is unique, but outline the key features of the product in a sentence or two.
    • URL Key – This is the URL that the product will be visible on. Ensure that this contains the product name, manufacturer and model number if it's from a well-known brand. Due to the way we've set up the URLs above, this will make the product URLs appear as follows: yourdomain.com/Philips-Sonicare-DiamondClean-Black-HX9352-Rechargeable-Toothbrush. This is quite a long URL, but it contains all the information about the product which will be relevant to users searching for it, including the model number and manufacturer brand.
    • Meta Information Tab (Meta Title, Keywords and Description) – It’s easy to miss the Meta Information Tab in Magento, but it’s important that you always give your products a well-written meta title and description. Keywords are optional (as explained above), but make sure that you keep your titles and descriptions within the correct length.

Key Things to Remember About All the Above Pages

  • Your Page Titles should contain the keywords you want to target. Usually it’s better to have these closer to the beginning of the Page Title.
  • The Page Title should be written for the user, not the search engines. Whilst you’re going to include keywords, make sure they work to provide you a good click through rate from the search engines.
  • If you added your brand name to the Title Suffix as described further up in this article, you will need to take this character limit into account.
  • Meta titles and descriptions should always be relevant to the content on the page. They should be descriptive and encourage people to click through without looking spammy.
  • I personally like to add the brand name into the Meta Description, but this is optional. I think it reinforces the brand name further.
  • Don’t go above the character limit recommended by Google for page titles and descriptions.
  • If you have Magento multi-store set up, all of the above values can be changed on a per-store view basis.

I hope this article has been helpful. Depending on the response, I may do a follow up article that explains the more advanced sections of Magento.

Magento is a very powerful system that is easily scalable, and I work with our clients at Pinpoint Designs worldwide to build and promote their stores with it. So if you have any questions regarding Magento, post a comment below and I’ll respond as soon as possible.

By Alexia Pamelov

Introduction About Docker

CheapWindowsHosting.com | Best and cheap Docker hosting. In this post we will explain everything about Docker.

What is Docker?

Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package. By doing so, thanks to the container, the developer can rest assured that the application will run on any other Linux machine regardless of any customized settings that machine might have that could differ from the machine used for writing and testing the code.


In a way, Docker is a bit like a virtual machine. But unlike a virtual machine, rather than creating a whole virtual operating system, Docker allows applications to use the same Linux kernel as the system that they’re running on and only requires applications be shipped with things not already running on the host computer. This gives a significant performance boost and reduces the size of the application.

And importantly, Docker is open source. This means that anyone can contribute to Docker and extend it to meet their own needs if they need additional features that aren’t available out of the box.

Who is Docker for?

Docker is a tool that is designed to benefit both developers and system administrators, making it a part of many DevOps (developers + operations) toolchains. For developers, it means that they can focus on writing code without worrying about the system that it will ultimately be running on. It also allows them to get a head start by using one of thousands of programs already designed to run in a Docker container as a part of their application. For operations staff, Docker gives flexibility and potentially reduces the number of systems needed because of its small footprint and lower overhead.

Docker and security

Docker brings security to applications running in a shared environment, but containers by themselves are not an alternative to taking proper security measures.

Dan Walsh, a computer security leader best known for his work on SELinux, gives his perspective on the importance of making sure Docker containers are secure. He also provides a detailed breakdown of security features currently within Docker, and how they function.

The future of Docker

A number of companies and organizations are coming together to bring Docker to desktop applications, a feat that could have wide-ranging impacts on end-users. Microsoft is even jumping on board by bringing Docker to their Azure platform, a development that could potentially make integration of Linux applications with Microsoft products easier than ever before.

Docker 1.0 was released on June 9th, during the first day of Dockercon, and it is considered the first release of Docker stable enough for enterprise use. Along with this launch, a new partnership was announced between Docker and the companies behind libcontainer, creating a unified effort toward making libcontainer the default standard for Linux-based containers. The growth of Docker and Linux containers shows no sign of slowing, and with new businesses jumping on the bandwagon on a regular basis, I expect to see a wealth of new developments over the coming year.
