
Thread: 'Crawl Errors' From Google??
      
   

  1. #1
    Join Date
    Oct 2008
    Posts
    42

    Default 'Crawl Errors' From Google??

    www.cgmulleralgarve.com

I recently added some pages to my existing website. As far as I can tell, none of the links I created are bad, yet Google tells me there are 77 bad links pointing to a 404 error.

How can I solve this issue? Should I suspect a hack?

    thanks for your time

  2. #2
    Join Date
    Mar 2006
    Posts
    14,683

    Default Re: received crawl error from google

Many of the Page Titles you created improperly have extra spaces in them rather than a hyphen ("-") or an underscore ("_") as required. As a result, each of your hyperlinks (although functional) is also improperly formed, and this is the cause of the Crawl Error reporting.

Here is one such erroneously created Page Title (URL) with the errors highlighted. You will note how the whitespace has been automatically "filled": URL encoding replaces each character not permitted in a URL with a percent sign ("%") followed by two hexadecimal digits, so a space becomes "%20", since no spaces or special symbols are permitted in Page File Titles or hyperlinks:
http://cgmulleralgarve.com/index%20for%20claude.html > should be > http://cgmulleralgarve.com/index-for-claude.html
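For anyone who wants to see the encoding in action, a few lines of Python reproduce exactly what happens to a file name saved with spaces (the file names below are taken from the URL above, purely as an illustration):

```python
from urllib.parse import quote

# A file name containing spaces gets percent-encoded when used in a URL:
# each space becomes "%20" (hex 20 is the ASCII code for the space character).
print(quote("index for claude.html"))   # index%20for%20claude.html

# Replacing spaces with hyphens before saving the page avoids the problem entirely.
print("index for claude.html".replace(" ", "-"))   # index-for-claude.html
```

This is why a page saved as "index for claude" produces the ugly "%20" URL, while "index-for-claude" publishes cleanly.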

Also, these pages lack a META Page Title, and this contributes to pages not being "found" (404) according to relevance. You need to create a Page Title via Page Properties (different from the page name you save it as in BlueVoda) for each of these incomplete pages.

    You'll need to re-Title each page properly (saved File name and META Title in Properties), and re-publish them with previously corrected site-wide navigation applied.


Do not rely heavily on such extraneous tools ... they are notoriously unreliable, as you can see from these easy-to-identify errors being classified improperly (if one knows what to look for -- most do not, and thus these tools rarely provide cogent usefulness). Instead, be sure to follow the Tutorials and advisements freely offered by VodaHost precisely, without interpretation and without skipping any steps, to achieve predictable results and optimal website performance.
    . VodaWebs....Luxury Group
    * Success Is Potential Realized *

  3. #3
    Join Date
    Oct 2008
    Posts
    42

    Default Re: received crawl error from google

Thank you, Vasili, for your quick response. I made all the changes you recommended, I hope.
I still feel I have to correct something with Google. I had an incident in 2011 when my website got hijacked, and it seems that I am still removed from the search engine; I am not sure what to do here. I have already asked for a re-evaluation.
What would you suggest?
Thanks again

  4. #4
    Join Date
    Mar 2006
    Posts
    14,683

    Default Re: 'Crawl Errors' From Google??

I personally would not intentionally reach out to Google and request any extra attention without cause. Just because you corrected site errors of your own creation (simple site basics that were neglected and may or may not have been contributing to a less-than-optimal cache evaluation) does not mean that you merit re-evaluation and a possibly improved SERP position (SERP = Search Engine Results Page, i.e. your position on the search page) when your site has never been singularly penalized. It is like asking for a battery of tests at the doctor's office just because you have a cold -- you may not like the surprises revealed: your site may have other serious errors that both you and I missed, which would only perpetuate a negative ranking without offering any recourse for repair!

    It is better to let changes happen "naturally" in the normal timeframe, even if that timeframe and manner has in fact been very carefully orchestrated.

A. Revisiting the idea that there are 'hidden' errors in your site that stand out as a "non-compliant build" or other serious flaw that may prevent your site from being valued normally by default, I would suggest that the irregular method you used to create this "additional artist's site" is likely to pose an obstacle until it is fixed. It is highly irregular to include a non-relevant section of a website in the manner you did, by simply naming a page "index-for-claude" (even though the page file titles and the hyperlinks were corrected), when you should instead have published it to a separate Sub-Directory within the website (seen as http://www.cgmulleralgarve.com/claude/), or more ideally to a Sub-Domain (seen as http://claude.cgmulleralgarve.com), in order to keep the Content continuity intact and any Relevance separate from your Main website. THIS is likely a major obstacle to your site being deemed in full compliance and worthy of an elevated SERP, as the SE's cannot understand what this artist site has to do with the Villa site's relevance. Do you understand this concept?? You should review this thread for more details outlining the concept and practical logic of creating a Sub-Domain for SEO compliance, and implement that solution before implementing the additional important suggestion I mention immediately below.

B. The normal cache cycle is between 28 and 42 days, and since you have updated your site significantly, these dynamic changes (adding pages, adding a proper site format by creating the Sub-Directory) should register with the SE's and spur an updated site-wide evaluation. If you do not yet have an updated and accurate sitemap.xml file, a robots.txt file, and a sitemap.html file installed on your site to interact with the SE's (you can also use them to ask the bots to visit on a specified basis for complete caching), I suggest you create and install these files immediately after making all the corrections and updates to your website. Having these files in place to instruct the SE's and bots how to value your site is another dynamic that helps re-institute proper recognition, if not a repaired SERP, and it is the most effective means not just to comply with site-build standards but to interact directly with the SE's and bots, which have a large share in how your site is valued. It is like talking to them in their own language, telling them what you think they should be doing -- which works to varying degrees, but it does work and never produces negative results (it is simply complying with standards in an advanced manner, which in your case should supersede any nefarious valuations previously cached).
Refer to this thread for details on how to get these important tasks done: Sitemaps: HTML, XML, etc ??
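As a rough illustration only (the URLs here are placeholders based on the domain discussed above, not a prescription), a minimal robots.txt placed in the site root looks like this:

```
# robots.txt -- uploaded to the site root, e.g. http://www.cgmulleralgarve.com/robots.txt
User-agent: *
Disallow:
Sitemap: http://www.cgmulleralgarve.com/sitemap.xml
```

And a minimal sitemap.xml in the standard sitemaps.org format lists each page you want cached:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.cgmulleralgarve.com/index.html</loc>
    <changefreq>monthly</changefreq>
  </url>
  <!-- one <url> entry per page of the site -->
</urlset>
```

An empty "Disallow:" line means nothing is blocked; the Sitemap line points the bots directly at your XML sitemap.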

C. Remember that if Claude's Sub-Domain website has been properly updated to compliant standards and is formatted correctly, it too should have these same 3 files in place in order to establish the proper 'independence' and SEO values that are necessary for each site. Just be sure that the sub-site has in fact been published with no errors or irregular formatting (remember to rename the first page of the Sub-Domain or Sub-Directory to simply 'index', just as you do for any website), for it is still "associated" with your site and should not be allowed to become a source of negative valuation attributed to your site.


    CORE OPTIMIZATION STRATEGY

For an elevated SERP position and greater WWW appeal, every site should implement an intelligent Optimization Strategy that is not only compliant with universal "Build Structure Standards" but uniquely refined using the basic "Core SEO Characteristics" methods that have recently been re-emphasized by Google as they attempt to eliminate unfair advantages while evaluating the explosion of new sites added daily. Optionally, these methods should include a minimally developed Backlink Structure and/or a cogent Reciprocal Link Structure, but at the very least every site should have the Core SEO entries in place to prove compliant with SE and WWW expectations.
You should have these Core SEO characteristics properly implemented on your site, per the Rules, without exception:
    1. Keyword META TAG (unique to every page)
    2. Page Description METATAG (unique to every page)
    3. Author METATAG
    4. Individual Page Titles
5. Heading tags (h1-h6; considered minimal Content relevancy development)
    6. Global Site Navigation Scheme on every page
    7. Privacy Policy (required by Law) and Transaction Policy (required by Law if applicable)
    8. XML Sitemap file (W3C compliant coding format: sitemap.xml)
9. Robots.txt file (even if left undefined, or "empty-formatted")
10. HTML Sitemap file (W3C-compliant coding, the redundant solution for the various bots that strictly read HTML and not XML: sitemap.html)
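To make items 1-5 concrete, here is a sketch of how those entries appear in a page's HTML head and body (all text values below are placeholders for illustration, not recommendations for this particular site):

```
<head>
  <title>Villa Rentals in the Algarve</title>                                 <!-- 4. individual Page Title -->
  <meta name="keywords" content="algarve, villa, rental">                     <!-- 1. keyword META tag, unique per page -->
  <meta name="description" content="Self-catering villas in the Algarve.">    <!-- 2. page description META tag, unique per page -->
  <meta name="author" content="Site Owner">                                   <!-- 3. author META tag -->
</head>
<body>
  <h1>Villas in the Algarve</h1>  <!-- 5. heading tags carry Content relevancy -->
</body>
```

In BlueVoda these entries are set through Page Properties rather than by hand-editing the HTML, but this is what ends up in the published page.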

    The elements "Required By Law" need to be implemented on every website hosted with VodaHost, as all physical hosting is provided by servers located in the United States and thus US Law has universal jurisdiction without exception. Compliance is mandatory as part of VodaHost's publicly published Terms of Service.

  5. #5
    Join Date
    Oct 2008
    Posts
    42

    Default Re: 'Crawl Errors' From Google??

Thank you very much, Vasili, for the extensive info. I see a lot of work ahead of me.

Claude is my better half, and the original idea was to fade out the initial domain (within a month's time) and maybe, if possible, change the domain name to Claude; then I would remove the first part and the art part would be a working site. Since I'm not the brightest candle when it comes to SE's and how they work, I might have the wrong idea?

    Nevertheless, I find your info very valuable and will keep on working on it.

    Thanks again,
    kind regards

  6. #6
    Join Date
    Mar 2006
    Posts
    14,683

    Default Re: 'Crawl Errors' From Google??

    The best solution is always the one that is the cleanest, the most direct, and clearly the most logical.

    OPTION A - (ACCEPTABLE)

If your intention was to allow the 'Villa' site to lapse because the units have been sold, that is one thing; but since it is permanently associated as the Primary Domain on your hosting account, you may wish instead to renew it when the time comes and simply convert it into a showcase site -- one that displays past accomplishments, or even a "management" site for the properties (perhaps to manage or promote the rentals after they've been purchased?), or some other type of site conversion that makes sense to you. The point is to keep your Primary Domain active so there is no conflict or functionality issue on your single hosting account.

If you choose to follow this option and renew the Villa Domain (and the hosting account as it is structured now), the simple alternative of creating a Sub-Domain or a Sub-Directory on the Villa site does not require a Domain to be purchased, but it carries the inconvenience of having the new "Art" site associated with the 'Villa' site. This may be confusing to Visitors of either site, as the two are completely different in nature and do not complement each other with regard to Search Engine valuations or relevance-building account-wide, which is a serious optimization error.

For this critical reason, it is suggested that you purchase a new Domain to be added as an Add-On Domain for the Art site, rather than create a Sub-Domain or Sub-Directory to publish it to.

    If you intend to take this course, then the following steps should be obvious to you:
1. Immediately remove the "Art" site from your "Villa" site completely, as it is seriously interfering with the SE values of each (we have already discussed this dilemma in depth above, and it needs to be resolved);
2. Purchase the new Domain for the Art site and create an Add-On Domain for it within the Villa hosting account;
3. Publish the Art site to this new Add-On Domain only after it has been properly developed for minimal SE compliance (outlined above as the 10 Core Optimization Strategy elements);
4. Re-develop the Villa site using all 10 Core Optimization Strategy elements, and re-publish it likewise.


    OPTION B - (BEST)

If maintaining the Villa site in any form is not feasible, then I would strongly suggest simply purchasing a new hosting account altogether (you get a new domain name free for the first year when you do this) and associating the new Domain name -- one you intend to keep longer than the Villa project's domain, which was 'temporary' as it was wholly focused on selling the units -- as the Primary for the Art site. You can simply publish both sites from the saved BlueVoda files you already have on your computer, and nothing will be missed or lost during the transition.
If you choose, you can add as many new websites to this new hosting account as Add-On's, as long as your Primary Art site is not "temporary" in nature as the Villa site was. Truth be told, it is not really a big deal to 'walk away' from your existing hosting account and purchase a brand new one to start over fresh ... the profits from selling the units are certainly enough to cover an expense that was earmarked specifically for this purpose!


    These are the clearest and most direct courses of action to solve all your issues, now that they have been revealed in full.

    If needed, refer to this thread for all the specific steps and methods that you should implement to achieve the optimal performance from each of your compliant and dynamic websites created using BlueVoda ....


    Complete BlueVoda Tutorial Archive

    Complete cPanel Tutorial Archive (creating Add-On Domains, Sub-Domains, etc.)



    THREAD CLOSED
