As you all know, after upgrading a database server from one major version to another, the ANALYZE command should be executed to refresh the statistics in the system catalogs for the newly loaded data. On a huge upgraded database, it is a challenge for the application to regain its performance until those statistics are rebuilt. In PostgreSQL 9.4, the new "vacuumdb --analyze-in-stages" option speeds up producing usable statistics for the optimizer. It analyzes the database in three stages, each with different configuration settings (default_statistics_target/vacuum_cost_delay).
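A minimal invocation (a command fragment, assuming the upgraded 9.4 cluster is already running locally on the default port):

```shell
# Analyze every database in three stages: a fast first pass with
# default_statistics_target=1 and vacuum_cost_delay=0, a second pass
# with a target of 10, and a final pass with the normal settings --
# so the planner gets rough but usable statistics almost immediately.
vacuumdb --all --analyze-in-stages
```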
Client access to a PostgreSQL database goes through the libpq library. It also supports automation without password prompts by keeping user credentials in a .pgpass file. Let's see…
Points to be considered when setting .pgpass file.
1. It should be created in the postgres user's home directory.
2. It must not allow any access to group or world (permissions 0600); libpq ignores the file otherwise.
3. Its location can be overridden with the PGPASSFILE environment variable.
Syntax to follow in .pgpass:
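A minimal sketch of creating the file; the format is one `hostname:port:database:username:password` entry per line, and the values below are placeholders, not real credentials:

```shell
# Create .pgpass in the home directory; each line is
# hostname:port:database:username:password
cat > "$HOME/.pgpass" <<'EOF'
localhost:5432:mydb:postgres:secret
EOF

# libpq refuses to use the file if group/world can read it,
# so restrict it to the owner only.
chmod 600 "$HOME/.pgpass"
ls -l "$HOME/.pgpass"
```

With this in place, `psql -h localhost -U postgres mydb` connects without prompting for a password.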
Back again; I have been getting very little time for blogging :)
“ERROR: could not create unique index
DETAIL: Table contains duplicated values.”
Postgres throws this error when REINDEX or CREATE UNIQUE INDEX fails because it encounters duplicate rows in a table with a primary key.
Why do duplicate rows exist in a table?
Not sure exactly :) nor any proven explanations out…
Two things come to mind.
Note: It is recommended not to tamper with the pg_catalog tables.
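Before the index can be rebuilt, the duplicates themselves have to be found and removed. A sketch using a hypothetical table `emp` with primary key column `id` (the names are illustrative, not from the original error):

```sql
-- List the physical row ids (ctid) of every duplicated key value.
SELECT ctid, id
FROM emp
WHERE id IN (SELECT id FROM emp GROUP BY id HAVING count(*) > 1)
ORDER BY id;

-- Keep one physical row per key and delete the rest.
DELETE FROM emp a
USING emp b
WHERE a.id = b.id
  AND a.ctid < b.ctid;
```

After the duplicates are gone, REINDEX or CREATE UNIQUE INDEX should complete without the error.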
On a forum I saw an interesting post along with its solution; however, a few things about that solution made me test it. The scenario: "How do you resize a VARCHAR column on a large table in less time, and what are the best approaches?" The standard way, as is well known, is to create a NEW column of the desired size, copy the OLD data into the newly created column, drop the OLD column, and finally rename the NEW column to the OLD column's name.
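The standard approach sketched above looks like this, using a hypothetical table `t` and column `col`:

```sql
-- Widen varchar column "col" the standard (slow) way; on a large
-- table the UPDATE rewrites every row, which is the expensive step.
ALTER TABLE t ADD COLUMN col_new varchar(100);  -- 1. new column, desired size
UPDATE t SET col_new = col;                     -- 2. copy old data
ALTER TABLE t DROP COLUMN col;                  -- 3. drop the old column
ALTER TABLE t RENAME COLUMN col_new TO col;     -- 4. rename new to old
```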