With continuing price reductions and performance increases in personal computer, workstation, and server hardware, computational chemistry researchers are able to develop and use more powerful software to study theoretical problems and to conduct richer simulations of biochemical processes.

Current projects include:
- LoBoS, a high-performance computing cluster built from commodity PC hardware.
- Development of parallel QM/MM methods, including Replica/Path with Q-Chem.
- Development and support of the CHARMM computational chemistry software.
- Improving the parallel performance of CHARMM via new spatial-decomposition and force-decomposition algorithms.
- Expansion of the EMAP method for tomography applications.
- Development of an image alignment, searching, and averaging program.
- Development of the 2D IPS method for membrane system simulations.
- Improvement of the Self-Guided Langevin Dynamics (SGLD) simulation algorithm.
- Development of IPS and SGLD algorithms for AMBER 9.
- Development of a general stand-alone local maximum clustering program.
- Improving the efficiency of CHARMM in a grid computing environment.
- A new multiscale effort, MSCALE, for general multiscale modeling.
- Dual-force-field modeling, combining CFF with the CHARMM force field.

A computer cluster with multi-core processors has been designed and procured, allowing scientists to run larger-scale simulations. LoBoS VI, completed in August 2006, added new nodes and a new network based on InfiniBand. This cluster has been a success, delivering measurably greater serial and parallel performance for computational chemistry applications.
The next cluster, LoBoS VII, was procured in FY07 and, when installed, will build on this success with new nodes and new double data rate (DDR) InfiniBand hardware that delivers twice the network bandwidth of the existing units.

Deploying CHARMM in a grid computing environment. For many biological problems, a single simulation provides insufficient data. To enable and facilitate running and analyzing a large number of simulations, we have deployed CHARMM in a grid environment. This work has been performed in collaboration with the Centre for Parallel Computing at the University of Westminster, who run the Grid Execution Management for Legacy Code Architecture (GEMLCA) project, and with the Open Science Grid project. Custom software has been developed to interface CHARMM with the Open Science Grid and to provide workflow and job management. A large-scale production series of molecular dynamics simulations has been run with this method.

A new multiscale command, MSCALE, has been implemented in the CHARMM program using a client-server paradigm. It allows one to run independent but connected subsystems within the CHARMM framework. A subsystem can be either a CHARMM job with its own input script or one of a variety of other computational chemistry codes, such as ab initio or molecular mechanics programs. The following ab initio programs are currently supported: NWCHEM, MolPro, Gaussian, Psi, and MPQC. The implementation also allows one to mix different force fields across subsystems, as well as different system scales, such as coarse-grained models, atomistic force fields, and QM, at the same time.
The controlling CHARMM script can run in parallel, and the subsystems may themselves be independent parallel jobs, allowing for further efficiency gains.

As part of the multiscale modeling capability of CHARMM, the code has been extended to allow two different force fields to be used for two different segments of a system. The goal was to properly address the solid-liquid multiphase problem of protein adsorption on synthetic polymer surfaces. In doing so, the CFF implementation in CHARMM has been fully decoupled so that other class-II force fields, such as PCFF and COMPASS, may be used in CHARMM. Additionally, image bonding for CFF has now been implemented. The interphase interaction is being tuned against experimental peptide adsorption data on poly(lactic acid) (PLA) polymers. This work is an ongoing collaboration with Prof. Latour at Clemson.
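The dual-force-field energy partitioning described above can be illustrated with a minimal sketch: one segment is evaluated with a class-I-style harmonic bond term, the other with a class-II-style quartic bond term (class-II force fields such as CFF use higher-order bond expansions), and an explicit interphase term couples the two segments. All function names, functional forms, and parameter values here are illustrative assumptions, not the actual CHARMM or CFF implementation.

```python
import math


def bond_energy_class1(coords, k=100.0, r0=1.5):
    # Harmonic bond between the two atoms of segment A (class-I style):
    # E = k * (r - r0)^2
    dx = [a - b for a, b in zip(coords[0], coords[1])]
    r = math.sqrt(sum(d * d for d in dx))
    return k * (r - r0) ** 2


def bond_energy_class2(coords, k2=80.0, k3=-10.0, k4=5.0, r0=1.4):
    # Quartic bond for segment B, mimicking the higher-order bond
    # expansion used by class-II force fields:
    # E = k2*dr^2 + k3*dr^3 + k4*dr^4, with dr = r - r0
    dx = [a - b for a, b in zip(coords[0], coords[1])]
    dr = math.sqrt(sum(d * d for d in dx)) - r0
    return k2 * dr ** 2 + k3 * dr ** 3 + k4 * dr ** 4


def interphase_energy(seg_a, seg_b, eps=0.2, sigma=3.0):
    # Lennard-Jones cross terms between the two segments; this is the
    # piece that would be tuned against peptide-adsorption data.
    # Parameters eps/sigma are placeholders, not fitted values.
    e = 0.0
    for a in seg_a:
        for b in seg_b:
            r = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
            sr6 = (sigma / r) ** 6
            e += 4.0 * eps * (sr6 * sr6 - sr6)
    return e


def total_energy(seg_a, seg_b):
    # Each segment sees only its own force field; the segments interact
    # only through the explicit interphase term.
    return (bond_energy_class1(seg_a)
            + bond_energy_class2(seg_b)
            + interphase_energy(seg_a, seg_b))
```

In this toy partitioning, the two force fields never mix parameters within a segment; only the cross (interphase) interaction spans the boundary, which is the part calibrated against experiment.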