
List:       ngw
Subject:    Re: [ngw] GroupWise Performance
From:       Gert <gw () gwcheck ! com>
Date:       2004-10-29 20:37:51
Message-ID: 20041029T223751Z_FFB700000000 () gwcheck ! com

We have a 30GB PO running on a Compaq Proliant ML370 with 1GB of memory
and three 36GB disks on RAID 5. 
>> We had this hardware in a GW 6.0 environment for 5000 users too, with 10 POs 
on 5 servers, excluding the primary and secondary domains, which ran on separate 
servers. All servers were ML370s, ML350s, or similar, with 1GB of memory and more disks. 

I run nightly contents checks on the PO and weekly structure check.  
>> We ran a nightly analyze/fix and a weekly contents fix, plus a nightly expire/reduce.

I am curious how GW handles its database - does it end up with pockets
of empty space when items are deleted?  If so, are those pockets not
recovered from the contents and structure checks I am doing now?
>> No, those checks do not recover the space. There is slack in the database, and it 
stays there unless you specify space reclamation for a GWCheck, or when you do a 
rebuild. Do a rebuild and you will see that it saves space. 
Also follow the guidelines in the Best Practices guide and the Deployment 
guide. You can find those on GWCheck.Com in the Books & Guides column 
or the Best Practices menu (with other useful links for optimizing your system). 

I don't have any performance issues opening messages, but opening
folders with more than 250/350 messages seems a bit on the slow side. 
>> Performance increases a little with QFI (QuickFinder indexing). A healthy network helps too, I guess ;)
More memory on the client can help as well, and Caching mode speeds things up there too.

How about running "Reclaim Unused Space" from within the "GroupWise
System Maintenance" screen or something similar??
>> See the above. And how about a mailbox quota, or expiring emails after a set 
retention period? Then there are archiving, Caching mode, mail policy, anti-spam and 
anti-virus, clustering, and all the other challenges out there. 
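To see why a mailbox quota bounds post office growth, here is a back-of-envelope sketch. The quota figure (100MB per user) is an assumption for illustration; only the 5000 users and 10 POs come from the setup described above.

```python
# Back-of-envelope: a per-user quota caps the worst-case message store size.
# The 100MB quota is an assumed example value, not a recommendation.

def max_po_size_gb(users, quota_mb):
    """Upper bound on message-store size (GB) if every user hits quota."""
    return users * quota_mb / 1024.0

total_gb = max_po_size_gb(5000, 100)   # 5000 users, assumed 100MB quota each
per_po_gb = total_gb / 10              # spread over 10 post offices
print(f"worst case: {total_gb:.0f} GB total, {per_po_gb:.1f} GB per PO")
```

With expire/reduce and archiving on top, actual stores stay well under this bound; the point is that without a quota there is no bound at all.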


- Gert
GWCheck.Com

