
PHP, Pear, pspell and a core dump

Sunday, April 6, 2008 


I’ve been getting core dumps from HTTPD since doing an update that included PHP. This happened to me before and I thought I’d try the same solution again, but it didn’t work. Pear was due an update; portupgrade -ra would get to it and error out. Attempting to force it manually was a dead end:
install ok: channel://pear.php.net/Console_Getopt-1.2.2
install ok: channel://pear.php.net/Structures_Graph-1.0.2
*** Signal 11

Couldn’t find any help on pear.php.net except to say it was a PHP problem. That seemed more likely when I found that
# php -v
yielded
segmentation fault (core dumped)

Many fingers point to Zend, and a few to recode.so, but one pointed to pspell.so.

I deleted that line from my .../etc/php/extensions.ini and voila:

claudel# php -v
PHP 5.2.5 with Suhosin-Patch 0.9.6.2 (cli) (built: Apr 5 2008 16:51:20)
Copyright (c) 1997-2007 The PHP Group
Zend Engine v2.2.0, Copyright (c) 1998-2007 Zend Technologies

I recompiled the whole PHP dependency tree with -O2, it still works fine, and I could update pear right up to 1.7.1.
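For reference, the fix amounts to removing (or commenting out) the pspell line from extensions.ini. A minimal sketch, run against a throwaway copy with assumed contents rather than the real config:

```shell
# Scratch stand-in for the real extensions.ini (the live file on this box
# was somewhere under .../etc/php/ -- the path varies by install).
printf 'extension=recode.so\nextension=pspell.so\n' > /tmp/extensions.ini
# Drop the pspell line, writing a cleaned copy.
grep -v '^extension=pspell.so' /tmp/extensions.ini > /tmp/extensions.ini.new
cat /tmp/extensions.ini.new
```

On the real system you would move the cleaned copy into place and re-run php -v to confirm the segfault is gone.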

Posted at 01:56:34 GMT-0700

Category: FreeBSD, Technology

DEMO 08 Palm Desert

Friday, February 1, 2008 

Capsule summaries of the companies presenting at DEMO 08 in Palm Desert. 76 reviews continue past the break (click to expand):


Posted at 16:55:51 GMT-0700

Category: Reviews, Technology

fixing GeoIP for awstats

Monday, October 22, 2007 

https://web.archive.org/web/20101102191506/http://forum.maxmind.com:80/viewtopic.php?t=27 helped, but the real key was hardcoding the database location in geoip.pm line 63, changing if (! $datafile) { $datafile="GeoIP.dat"; } to if (! $datafile) { $datafile="/path/to/GeoIP.dat"; }.
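The same one-line change can be made non-interactively with sed; a sketch, demonstrated on a scratch file standing in for geoip.pm (the /path/to placeholder is kept as in the text above):

```shell
# Scratch file standing in for the relevant line of geoip.pm.
printf 'if (! $datafile) { $datafile="GeoIP.dat"; }\n' > /tmp/geoip_line.txt
# Hardcode the database location (placeholder path, as above).
sed 's|\$datafile="GeoIP.dat"|\$datafile="/path/to/GeoIP.dat"|' \
    /tmp/geoip_line.txt > /tmp/geoip_line.new
cat /tmp/geoip_line.new
```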

Posted at 10:45:14 GMT-0700

Category: FreeBSD, Technology

updating bothers

Saturday, October 6, 2007 

I recently did a portupgrade -ra on my server after some period of complacence. It was instigated by having to clean out my MySQL logs after they ate up 30GB of disk space and caused some table corruption.

Anyway, the key details were that
apache+mod_ssl-1.3.37+2.8.28 > needs updating (port has 1.3.39+2.8.30)
php5-5.2.3 > needs updating (port has 5.2.4)

(among about 50 others)

Some foreshadowing: once I updated and rebooted, I got in /var/log/messages only
kernel: pid 1127 (httpd), uid 0: exited on signal 11 (core dumped)
and in /var/log/httpd-error.log only
[info] mod_unique_id: using ip addr 66.93.181.130
every time I ran "apachectl start" (even after setting the Apache config log level to "debug")

No go.

Much mailing-list searching ensued, but Torfinn Ingolfsen on the freebsd-stable mailing list suggested looking at PHP. It turned out that disabling php.so in httpd.conf got Apache sort of working, but that was no help, so I thought, eh, why not migrate to Apache 2.2.6?

That helped a lot. First, the default config crashed when run with SSL, but the cause was hinted at in the logs:

[Tue Oct 02 11:30:26 2007] [info] mod_unique_id: using ip addr 66.93.181.130
[Tue Oct 02 11:30:27 2007] [info] Init: Seeding PRNG with 136 bytes of entropy
[Tue Oct 02 11:30:27 2007] [info] Loading certificate & private key of SSL-aware server
[Tue Oct 02 11:30:27 2007] [error] Server should be SSL-aware but has no certificate configured [Hint: SSLCertificateFile]

so I disabled SSL for the moment. But I was also getting periodic seg faults from Apache, with no details (even less than with 1.3.39). Disabling PHP made them go away, but that was no fix; at least Apache 2 was self-restarting, so aside from log pollution, no real problem…

It occurred to me that my make.conf -O2 compiler specification might be part of the problem, so I changed just that to -O1 and recompiled with portupgrade -rf php5, and no more seg faults. 5.2.3 had no trouble with -O2, but 5.2.4 doesn’t seem stable with -O2 optimization.
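The make.conf edit itself is one line; here is a sketch against an example file with assumed contents rather than the live /etc/make.conf:

```shell
# Example make.conf fragment (contents assumed for illustration).
printf 'CPUTYPE?=pentium3\nCFLAGS= -O2 -pipe\n' > /tmp/make.conf.example
# Back the optimization level off from -O2 to -O1.
sed 's/-O2/-O1/' /tmp/make.conf.example > /tmp/make.conf.new
cat /tmp/make.conf.new
# On the real system, follow with: portupgrade -rf php5
```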

The SSL problem was that /usr/local/etc/apache22/httpd.conf had a section at the end about the following lines being present to support starting with SSL, blah blah, and the third line from the bottom was "SSLEngine on". It was turning the engine on twice, since I was already using extra/httpd-ssl.conf. I commented that line out and everything seems fine now.
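A quick way to spot this kind of doubled directive is to grep across the whole config tree; a sketch using scratch files standing in for the real apache22 config directory:

```shell
# Scratch stand-ins for httpd.conf and extra/httpd-ssl.conf.
mkdir -p /tmp/apache22/extra
printf 'Listen 80\nSSLEngine on\n' > /tmp/apache22/httpd.conf
printf 'SSLEngine on\n' > /tmp/apache22/extra/httpd-ssl.conf
# Two hits means the engine is being enabled in two places.
grep -rn 'SSLEngine' /tmp/apache22
```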

Posted at 11:00:15 GMT-0700

Category: FreeBSD, Technology

Search Engine Enhancement

Wednesday, September 12, 2007 

Getting timely search engine coverage of a site means people can find things soon after you change or post them.

Most search engines find linked pages by following external links or manual URL submissions every few days or so, but they won’t find unlinked pages or broken links, and the ranking and efficiency of the search is likely suboptimal compared to a site that is indexed for easy searching using a sitemap.

There are three basic steps to having a page optimally indexed:

  • Generating a Sitemap
  • Creating an appropriate robots.txt file
  • Informing search engines of the site’s existence

Sitemaps
It seems like the world has settled on sitemaps for making search engines’ lives easier. There is no indication that a sitemap actually improves rank or crawl rate, but it seems likely that it does, or that it will soon. The format was created by Google and is supported by at least Google, Yahoo, Ask, and IBM. The reference is at sitemaps.org.

Google has created a Python script to generate a sitemap through a number of methods: walking the HTML path, walking the directory structure, parsing Apache-standard access logs, parsing external files, or direct entry. It seems to me that walking the server-side directory structure is the easiest, most accurate method. The script itself is on SourceForge. The directions are good, but if you’re only using the directory structure, the config.xml file can be edited down to something like:

<?xml version="1.0" encoding="UTF-8"?>
<site
  base_url="http://www.your-site.com/"
  store_into="/www/data-dist/your_site_directory/sitemap.xml.gz"
  verbose="1"
  >

 <url href="http://www.your-site.com/" />
 <directory
    path="/www/data-dist/your_site_directory"
    url="http://www.your-site.com/"
    default_file="index.html"
 />
</site>

Note that this will index every file on the site, which can be a lot. If you use your site for media files or file transfer, you might not want to index every part of it, in which case you can use filters to block the indexing of parts of the site or certain file types. If you only want to index web files you might insert the following:

 <filter  action="pass"  type="wildcard"  pattern="*.htm"           />
 <filter  action="pass"  type="wildcard"  pattern="*.html"          />
 <filter  action="pass"  type="wildcard"  pattern="*.php"           />
 <filter  action="drop"  type="wildcard"  pattern="*"               />

Running the script with

python sitemap_gen.py --config=config.xml

will generate the sitemap.xml.gz file and put it in the right place. If the uncompressed file size is over 10MB, you’ll need to pare down the files listed. This can happen if the filters are more inclusive than what I’ve given, particularly if you have large photo or media directories or something like that and index all the media and thumbnail files.
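A quick way to check the uncompressed size against that 10MB limit (demonstrated here on a tiny generated stand-in; a real one would live at the store_into path above):

```shell
# Tiny stand-in sitemap.xml.gz for demonstration.
printf '<?xml version="1.0" encoding="UTF-8"?>\n' | gzip > /tmp/sitemap.xml.gz
# Uncompressed byte count; the limit is 10485760 bytes (10MB).
gzip -dc /tmp/sitemap.xml.gz | wc -c
```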

The sitemap will tend to get out of date. If you want to update it regularly, there are a few options: one is to use a WordPress sitemap generator (if that’s what you’re using and indexing), which does the right thing and generates a sitemap using data available to WordPress but not to the file system (a good thing); another is to add a cron job to regenerate the sitemap regularly. For example

3  3  *  *  *  root python /path_to/sitemap_gen.py --config=/path_to/config.xml

will update the sitemap daily.

robots.txt

The robots.txt file can be used to exclude certain search engines (for example MSN, if you don’t like Microsoft for some reason and are willing to sacrifice traffic to make a point); it also points search engines to your sitemap file. There’s kind of a cool tool here that generates a robots.txt file for you, but a simple one might look like:

User-agent: MSNBot                             # Agent I don't like for some reason
Disallow: /                                    # Path it isn't allowed to traverse
User-agent: *                                  # For everything else
Disallow:                                      # Nothing is disallowed
Disallow: /cgi-bin/                            # Directory nobody can index
Sitemap: http://www.my_site.com/sitemap.xml.gz # Where my sitemap is

Telling the world

Search engines are supposed to do the work; that’s their job. They should find your robots.txt file eventually, read the sitemap, and then parse your site without any further assistance. But to expedite the process and possibly enhance search results, there are submission tools at Yahoo, Ask, and particularly Google that generally allow you to add meta information.
Ask
Ask.com allows you to submit your sitemap via URL (and that seems to be all they do)
http://submissions.ask.com/ping?sitemap=http://www.your_site.com/sitemap.xml.gz


Yahoo
Yahoo has some site submission tools and supports site authentication, which means putting a random string in a file they can find to prove you have write-access to the server. Their tools are at
https://siteexplorer.search.yahoo.com/mysites


with submissions at
https://siteexplorer.search.yahoo.com/submit.php


you can submit sites and feeds. I usually use file authentication, which means creating a file named with a random string (y_key_random_string.html) whose only contents are another random string. They authenticate within 24 hours.
It isn’t clear whether submitting a feed also adds the site, but it looks like it does. If you don’t have a feed, you may not need to authenticate the site for submission.
Google
Google has a lot of webmaster tools at
https://www.google.com/webmasters/tools/siteoverview?hl=en


The verification process is similar, but you don’t have to put data inside the verification file, so

touch googlerandomstring.html

is all you need to get the verification file up. You submit the URL to the sitemap directly.
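The two verification files described above can be created like so (all file names and strings here are placeholders, not real keys, and the docroot path is an assumption):

```shell
# Placeholder docroot for demonstration.
mkdir -p /tmp/docroot
# Yahoo-style: file named with one random string, containing another.
echo "another-random-string" > /tmp/docroot/y_key_random_string.html
# Google-style: the file merely has to exist.
touch /tmp/docroot/googlerandomstring.html
ls /tmp/docroot
```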
Google also offers blog tools at
http://blogsearch.google.com/ping


where you can manually add your blog’s feed to Google’s blog search tool.

Posted at 13:25:13 GMT-0700

Category: FreeBSD, Technology

ZoneMinder on FC7

Saturday, August 25, 2007 


Overview
ZoneMinder 1.22.3 on Fedora Core 7

There are useful instructions at this URL

Do a basic install of FC7.

  • KDE seems to work better than GNOME.
  • Remove unnecessary SW to speed install (desktop stuff)
  • Add Server and Development to get the right tools
  • Add https://www.systutorials.com/additional-repositories-for-fedora-linux/ as an RPM source
  • Make sure the necessary holes are in the firewall (port 80)

Add necessary bits
yum install mysql-server mysql-devel php-mysql pcre-devel \
perl-DateManip perl-libwww-perl perl-Device-SerialPort \
perl-MIME-Lite perl-Archive-Zip

updating Perl (some modules will be installed already)
perl -MCPAN -e shell
install Bundle::CPAN
reload CPAN
install Archive::Tar
install Archive::Zip
install MIME::Lite
install MIME::Tools
install DateTime
install Date::Manip
install Bundle::libnet
install Device::SerialPort
install Astro::SunTime
install X10
quit

FFMPEG install

Note that getting the FFMPEG libraries installed so they work is a nightmare. I followed these instructions and they seemed to work:

First add the x264 libraries and devel from livna via software manager

If the RPM database hangs, try
rm /var/lib/rpm/__db*
rpm --rebuilddb
yum clean all

svn checkout svn://svn.mplayerhq.hu/ffmpeg/trunk ffmpeg
cd ffmpeg/

./configure --enable-shared --enable-pp \
--enable-libx264 --cpu=pentium3 --enable-gpl

make
make install
nano /etc/ld.so.conf

add the line "/usr/local/lib"
ldconfig
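The nano step above just appends one line; a non-interactive equivalent, demonstrated on a scratch copy rather than the real /etc/ld.so.conf:

```shell
# Scratch copy standing in for /etc/ld.so.conf.
touch /tmp/ld.so.conf.demo
# Append /usr/local/lib only if it is not already present (idempotent).
grep -qx '/usr/local/lib' /tmp/ld.so.conf.demo || \
    echo '/usr/local/lib' >> /tmp/ld.so.conf.demo
cat /tmp/ld.so.conf.demo
# On the real file, run ldconfig afterwards (as root).
```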

System daemons

chkconfig --add mysqld
chkconfig --level 345 mysqld on
chkconfig --level 345 httpd on
service mysqld start
service httpd start

add to /etc/sysctl.conf to increase shared memory limit
kernel.shmall = 134217728
kernel.shmmax = 134217728

Zoneminder Install

Check the latest version of zoneminder at https://web.archive.org/web/20110310142847/http://www.zoneminder.com:80/downloads.html

wget https://zoneminder.com/downloads/
tar xvfz ZoneMinder-1.22.3.tar.gz
cd ZoneMinder-1.22.3

patch it

https://web.archive.org/web/20110501001851/http://www.zoneminder.com:80/wiki/index.php/1.22.3_Patches

The configure command I used is:
./configure --with-webdir=/var/www/html/zm \
--with-cgidir=/var/www/cgi-bin ZM_DB_HOST=localhost \
ZM_DB_NAME=zm ZM_DB_USER=zmuser ZM_DB_PASS=zmpass \
CFLAGS="-g -O3 -march=pentium3" CXXFLAGS="-g -O3 \
-march=pentium3" --with-ffmpeg=/usr/bin \
--with-webuser=apache --with-webgroup=apache

substituting a reasonable user name for "zmuser" and a password for "zmpass"

make
make install

If make barfs with
/usr/local/src/ZoneMinder-1.22.3/src/zm_mpeg.cpp:284: undefined reference to `av_free(void*)'
try
"in src/zm_mpeg.h starting on line 26, add the lines with the + (removing the + of course). The other lines are just for reference and should already be in the file." (from a reference now lost to the void, alas)

nano src/zm_mpeg.h

#define ZM_MPEG_H
+extern "C" {
+#define __STDC_CONSTANT_MACROS
#include <ffmpeg/avformat.h>
+}
#if FFMPEG_VERSION_INT == 0x000408

Install scripts
install scripts/zm /etc/init.d/
chkconfig --add zm

Create and configure the ZoneMinder database
mysql mysql < db/zm_create.sql
mysql mysql

at the mysql prompt:
grant select,insert,update,delete on zm.* to \
'zmuser'@localhost identified by 'zmpass';
quit

mysqladmin reload

GO!
service zm start

you should get a nice green [OK].

http://127.0.0.1/zm

Black Screen? Go Faster?
No php?
If you have issues, make sure you have installed the Apache PHP and Perl modules.

IJG SIMD jpeg should double performance.
http://cetus.sakura.ne.jp/softlab/jpeg-x86simd/jpegsimd.html#source
* requires nasm, which wasn’t installed; use the package manager.
wget http://cetus.sakura.ne.jp/softlab/jpeg-x86simd/sources/jpegsrc-6b-x86simd-1.02.tar.gz
tar xvfz jpegsrc-6b-x86simd-1.02.tar.gz
cd jpeg-6bx
./configure --enable-shared --enable-static
nano Makefile

* Change the CFLAGS from -O2 to -O3 and add
-funroll-loops -march=pentium3 -fomit-frame-pointer

make
make test
make install

identify the libraries to the system
ldconfig

I also copied the installed files from /usr/local/bin to /usr/bin:
cp /usr/local/bin/cjpeg /usr/bin/cjpeg
cp /usr/local/bin/djpeg /usr/bin/djpeg
cp /usr/local/bin/jpegtran /usr/bin/jpegtran
cp /usr/local/bin/rdjpgcom /usr/bin/rdjpgcom
cp /usr/local/bin/wrjpgcom /usr/bin/wrjpgcom

/etc/init.d/zm restart

NetPBM resizes the JPEGs, and faster is better: compile and install
cd /usr/src
svn checkout https://netpbm.sourceforge.net/ netpbm
cd netpbm
/usr/src/netpbm/configure

Answer the questions (GNU and then defaults; I didn’t have TIFF or VGA libs, so "none")
vi Makefile.config
I added -march=pentium3 to the CFLAGS at the end of the file
make
make package
/usr/src/netpbm/installnetpbm

accept defaults

cambozola install

* add the Ant package (it needs Ant, but it wasn’t installed by default)
cd /usr/src
wget https://web.archive.org/web/20220526174548/http://www.charliemouse.com/code/cambozola/cambozola-latest.tar.gz
tar xvfz cambozola-latest.tar.gz
cp /usr/src/cambozola-0.68/dist/cambozola.jar /var/www/html/zm
chmod 775 /var/www/html/zm/cambozola.jar

Posted at 01:44:34 GMT-0700

Category: Linux, Technology