Riparian

Members
  • Posts: 168
  • Joined

  • Last visited

Profile Information

  • Gender: Not Telling

Riparian's Achievements

Regular Member (3/5)

Reputation: 0

  1. The sitemap is coming from SEO PowerSuite's Website Auditor, but the result is the same in every other program I can find to check the site. I have tried to fix this many, many times over the years on different forums but have never found a way to stop it from duplicating the URLs that have parameters after the .php extension. I am in a penalty (even though they do not call it that) at the moment and believe this may well be contributing to the problem. My site has dropped from position one for all 20 main keywords to not ranking at all for my most important keywords... hence why I simply must find a fix. Cheers
  2. I am afraid that did not help :( I have set the product display page to <link rel="canonical" href="https://www.mysite.com.au/great_xxxx.php?page=<?php echo $page ?>"> and the product details page to <link rel="canonical" href="https://www.mysite.com.au/expand_xxx.php?model=<?php echo $model ?>">, but as you can see from the attached screenshot it has made no difference to the sitemap information, and I assume the robots will see the same thing... thousands of them. Any thoughts? (A hedged canonical-building sketch is included after this list.) Cheers
  3. So it is OK to have <link rel="canonical" href="https://www.mysite/ MyProductsPage.php ? model_number= <?php echo $model_number ?>"/>? Thanks
  4. Hi requinix, and thank you for replying. I actually have canonicals set up in "product display" and assumed that would solve the problem, but no luck unfortunately. I cannot use a canonical in "product details" because "details" needs the model number, which is passed from "display". Each time the "details" program is accessed without a model number it produces an error, so I am assuming that if a bot gets the error message this would be worse than multiple internal links? (One way of handling the missing model number is sketched after this list.) Cheers
  5. This is a little difficult to explain. I have a retail site with about 1000 products. The flow of the site is index > product display > product details > shopping cart > and so on. When I try to make a sitemap, or run similar programs such as SEO PowerSuite, the program finds over 6000 entries (before I stop it), with up to 100 links per item, because it, and the search engines, keep finding one link on page 1 with model number 1, then page 2 with model number 1, then page 3 with model number 1, and so on ad infinitum. What is worse is that the bots do the same thing on the product display page, which I can only assume Google is not too keen on. This all came about because when I wrote the program, "dirty" URLs were not heard of, and now it is too late to change. This is something I have struggled with for years, but I am hoping there might be a workaround and would like an expert opinion. My question is: putting "noindex" on the product display page completely stops this behaviour, BUT will it be a real negative for SEO (not that it could get much worse, because the August 1 algorithm update killed the business anyway!)? (A sketch of emitting the noindex tag is included after this list.) Any help or suggestions are greatly appreciated. Thanks
  6. Hi, thanks for the replies. I am aware that this can be done somehow, purely because my host runs the exact same thing: if I need support I create a ticket on their site, they reply to me via normal email, and from then on we converse using (in my case) MS Outlook. If anyone can point me in the right direction I would certainly appreciate it. Cheers
  7. Sorry that the topic is cryptic, but I am unsure how to ask the question. I have written, and have been using for many years, a PHP client contact ticket system based on PHPMailer. The problem I have is that when I send an email out to a client, they reply using the "Reply" button in their email program and ignore the "please click here to reply" link, which takes them to the site where they can log in and be shown all previous correspondence, etc. What I would like is for clients to be able to reply from their Outlook (or similar) email program, but have their reply and other details stored in MySQL. Is this possible? (One possible approach is sketched after this list.) Thanks, and any help is greatly appreciated.
  8. Hello kicken, you are a legend... thank you so much for this fix! This one thing has been holding me up for weeks since upgrading from 5.6 to 7. Cheers
  9. Hi, and thank you for your reply. Apologies for the code layout. The message actually shows only in Excel, so there is no way to capture the error that I can think of. Please see this link for clarification: goo.gl/zBfRvF Cheers
  10. Hi, I am seeking help on exporting MySQL records to an Excel file under PHP 7, for a labels program. My code has worked for many years but now breaks with PHP 7: when I run the program all I get is "cannot open protected file". I have tried a number of different headers to no avail. This is the code that produces the Excel file; the importing of records into the database all works fine, it seems to be just the export to Excel that has changed.

      <?php
      require_once('../includes/session.php');

      // Get data records from table.
      $Result = mysqli_query($GLOBALS["___mysqli_ston"], "select * from apo_bulk ")
          or die('482.....' . mysqli_error($GLOBALS["___mysqli_ston"]));

      // fetch each row as an array and place it into a holder array ($aData),
      // appending a fixed spacer row after every record
      $x = 0;
      while ($row = mysqli_fetch_assoc($Result)) {
          $x++;
          $aData[] = $row;
          if ($x == 1) {
              $aData[] = array('A', '0.5', '17', '10', '6');
              $x = 0;
          }
      }

      $contents = getExcelData($aData);
      $filename = "apo_label.xls";

      // prepare to give the user a Save/Open dialog...
      header("Content-type: application/octet-stream");
      header("Content-Disposition: attachment; filename=" . $filename);

      // set the cache expiration to 30 seconds ahead of the current time -
      // an IE 8 issue when opening the data directly in the browser without
      // first saving it to a file
      $expiredate = time() + 30;
      $expireheader = "Expires: " . gmdate("D, d M Y G:i:s", $expiredate) . " GMT";
      header($expireheader);

      // output the contents
      echo $contents;
      exit;

      function getExcelData($data) {
          $retval = "";
          if (is_array($data) && !empty($data)) {
              $row = 0;
              foreach (array_values($data) as $_data) {
                  if (is_array($_data) && !empty($_data)) {
                      if ($row == 0) {
                          // column headers removed because we do not want the
                          // first line to be header names
                          // $retval = implode("\t", array_keys($_data));
                          // $retval .= "\n";
                      }
                      // create a line of tab-separated values for this row...
                      $retval .= implode("\t", array_values($_data));
                      $retval .= "\n";
                      // increment the row so we don't create headers all over again
                      $row++;
                  }
              }
          }
          return $retval;
      }
      ?>

      (A hedged CSV-export alternative is sketched after this list.) Any help is appreciated. Cheers
  11. Hi. I have a very old and large site (mostly written with the help of PHP Freaks!). I am now told by Google that the site will display an "insecure, get me out of here" type message if I do not secure the personal info (more than fair). I do not want to secure the whole site because, as I believe, Google sees the secure site as a completely different site to the http site, so I would lose 10 years of SEO, not to mention the duplicate content issues. I have used this code, which seems to work fine, BUT when I leave the secure page the https stays for the whole site.

      # rewrite individual pages to https
      RewriteEngine on
      RewriteCond %{HTTPS} off
      RewriteRule ^login\.php$ https://www.test.com.au/login.php [L,R=301]

      RewriteEngine on
      RewriteCond %{HTTPS} off
      RewriteRule ^checkout\.php$ https://www.test.com.au/checkout.php [L,R=301]

      (See the sketch after this list for one way of sending visitors back to http on the non-secure pages.) Any help is greatly appreciated. Cheers and thanks
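Sketches referenced in the posts above

Re posts 2 and 3 (the canonical tags): a minimal sketch of how the tag on expand_xxx.php (the "product details" page) might be built so that each model gets a single, consistently encoded canonical URL. The file, parameter and domain names are taken from the posts; everything else is an assumption, not a known fix.

      <?php
      // Sketch for the <head> of expand_xxx.php. Assumes $model comes from the
      // query string and that each model has exactly one details URL.
      // rawurlencode() keeps spaces and other unsafe characters out of the
      // href, which was the question in post 3.
      $model = isset($_GET['model']) ? trim($_GET['model']) : '';

      if ($model !== '') {
          $canonical = 'https://www.mysite.com.au/expand_xxx.php?model=' . rawurlencode($model);
          echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '">' . "\n";
      }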
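Re post 4 (the error when "details" is opened without a model number): one option, offered as an assumption rather than a known fix, is to answer those requests with a 404 so crawlers drop the URL instead of indexing an error page.

      <?php
      // Sketch for the top of expand_xxx.php: if no model number was passed,
      // send a 404 status and a short message instead of the usual error.
      if (empty($_GET['model'])) {
          http_response_code(404);   // available from PHP 5.4
          echo 'Product not found.';
          exit;
      }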
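Re post 5 (noindex on the "product display" page): a sketch of how the tag could be emitted. Whether noindex, noindex,follow, or some other combination is the right SEO call is not something these posts settle, and the page-number condition below is also an assumption.

      <?php
      // Sketch for the <head> of great_xxxx.php. "noindex,follow" asks engines
      // not to index the paginated listing views while still following the
      // links through to the product details pages.
      $page = isset($_GET['page']) ? (int) $_GET['page'] : 1;

      if ($page > 1) {
          echo '<meta name="robots" content="noindex,follow">' . "\n";
      }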
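Re post 7 (storing email replies in MySQL): hosts that do this typically either pipe inbound mail to a script or poll a support mailbox. The sketch below shows the polling shape using PHP's IMAP extension; the mailbox details, the ticket_replies table and the "[Ticket #123]" subject convention are all assumptions, not anything from the thread.

      <?php
      // Rough sketch of a cron job that polls a support mailbox over IMAP and
      // stores replies in MySQL. Assumes the outgoing mail (sent with PHPMailer)
      // carries something like "[Ticket #123]" in the subject so a reply can be
      // matched back to its ticket.
      $mbox = imap_open('{mail.example.com:993/imap/ssl}INBOX', 'support@example.com', 'password');
      $db   = new mysqli('localhost', 'user', 'pass', 'tickets');

      $unread = imap_search($mbox, 'UNSEEN');
      if ($unread !== false) {
          foreach ($unread as $msgno) {
              $header  = imap_headerinfo($mbox, $msgno);
              $subject = imap_utf8($header->subject);
              $from    = $header->from[0]->mailbox . '@' . $header->from[0]->host;
              $body    = imap_fetchbody($mbox, $msgno, 1);   // plain-text part, simplest case

              // Pull the ticket number out of a subject such as "Re: [Ticket #123] ..."
              if (preg_match('/\[Ticket #(\d+)\]/', $subject, $m)) {
                  $stmt = $db->prepare(
                      'INSERT INTO ticket_replies (ticket_id, from_email, body, received_at)
                       VALUES (?, ?, ?, NOW())'
                  );
                  $stmt->bind_param('iss', $m[1], $from, $body);
                  $stmt->execute();
              }
          }
      }
      imap_close($mbox);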
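Re post 10 (the "cannot open protected file" message in Excel): I do not know what the fix from post 8 was; one common alternative, offered purely as an assumption, is to export the same rows as a .csv download with a matching content type, so Excel is not handed tab-separated text wearing a .xls extension.

      <?php
      // Sketch of a CSV export of the same apo_bulk data. Assumes
      // $GLOBALS["___mysqli_ston"] is the mysqli connection opened in
      // ../includes/session.php, as in the original code.
      require_once('../includes/session.php');

      $result = mysqli_query($GLOBALS["___mysqli_ston"], "select * from apo_bulk")
          or die('482.....' . mysqli_error($GLOBALS["___mysqli_ston"]));

      header('Content-Type: text/csv; charset=utf-8');
      header('Content-Disposition: attachment; filename="apo_label.csv"');

      // Write straight to the output stream; fputcsv() handles quoting.
      $out = fopen('php://output', 'w');
      while ($row = mysqli_fetch_assoc($result)) {
          fputcsv($out, array_values($row));
          // The original code appends this fixed spacer row after every record.
          fputcsv($out, array('A', '0.5', '17', '10', '6'));
      }
      fclose($out);
      exit;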
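Re post 11 (https "sticking" after leaving login or checkout): the rewrite rules only push traffic onto https; nothing pushes it back. One way to do that, sketched here under the assumption that every ordinary page pulls in a shared header include, is a small PHP check that bounces https requests back to http on the non-secure pages. The domain and page list are the placeholders from the post.

      <?php
      // Sketch: include at the top of the ordinary (non-secure) pages. If the
      // visitor arrives over https on a page that is not in the secure list,
      // redirect back to plain http so https does not stick for the whole site.
      $securePages = array('login.php', 'checkout.php');

      $script  = basename($_SERVER['SCRIPT_NAME']);
      $isHttps = !empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off';

      if ($isHttps && !in_array($script, $securePages, true)) {
          header('Location: http://www.test.com.au' . $_SERVER['REQUEST_URI'], true, 301);
          exit;
      }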